904 results for Optimisation of methods
Abstract:
As the methodologies available for the detection of positive selection from genomic data vary in their assumptions and execution, weak correlations are expected among them. However, if a given signal is consistently supported across different methodologies, that is strong evidence that the locus has been under past selection. In this paper, a straightforward frequentist approach based on the Stouffer method for combining P-values across different tests for evidence of recent positive selection on common variants, together with strategies for extracting biological information from the detected signals, is described and applied to high-density single nucleotide polymorphism (SNP) data generated from dairy and beef cattle (taurine and indicine). The ancestral Bovinae allele state of over 440,000 SNPs is also reported. Using this combination of methods, highly significant (P < 3.17 × 10⁻⁷) population-specific sweeps pointing to candidate genes and pathways that may be involved in beef and dairy production were identified. The most significant signal was found in the Cornichon homolog 3 gene (CNIH3) in Brown Swiss (P = 3.82 × 10⁻¹²) and may be involved in the regulation of the pre-ovulatory luteinizing hormone surge. Other putative pathways under selection are glycolysis/gluconeogenesis, the transcription machinery and chemokine/cytokine activity in Angus; the calpain-calpastatin system and ribosome biogenesis in Brown Swiss; and ganglioside deposition in milk fat globules in Gyr. The composite method, combined with the strategies applied to retrieve functional information, may be a useful tool for surveying genome-wide selective sweeps and providing insights into the source of selection.
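To illustrate the combination step, a minimal sketch of Stouffer's Z-score method is given below; this is a generic implementation, and the weighting scheme, the one-sided convention and the set of tests actually combined in the paper are assumptions.

    import numpy as np
    from scipy.stats import norm

    def stouffer_combine(p_values, weights=None):
        """Combine one-sided P-values from k tests into a single P-value (Stouffer's method)."""
        p = np.asarray(p_values, dtype=float)
        w = np.ones_like(p) if weights is None else np.asarray(weights, dtype=float)
        z = norm.isf(p)                                   # per-test Z-scores
        z_comb = np.sum(w * z) / np.sqrt(np.sum(w ** 2))  # weighted combined Z
        return norm.sf(z_comb)                            # combined one-sided P-value

    # e.g. three selection tests supporting the same locus
    print(stouffer_combine([0.01, 0.003, 0.05]))

Loci whose combined P-value falls below the genome-wide threshold (here P < 3.17 × 10⁻⁷) would then be reported as candidate sweeps.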
Abstract:
Studies on innovation and technology management have emphasized the importance of integration between the research and development (R&D) department and the other functions involved in the product development process (PDP) as a relevant practice for good performance in technological product innovation activities. This study addresses the transfer of technologies to new product projects and the integration practices between the R&D department and the other functions involved in the PDP. A qualitative study was conducted, operationalized through two case studies at large high-tech companies: one Brazilian and the other a multinational subsidiary in Brazil. Among its main results, this paper presents and analyzes management practices that favour integration in product development projects demanding the development and transfer of technologies, such as the participation of R&D personnel in market activities, the adoption of virtual interaction mechanisms, and the application of methods such as technology roadmaps. © Universidad Alberto Hurtado, Facultad de Economía y Negocios.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The incidence of encephalic tumors in dogs and cats has increased in recent years due to the continuing advance of the specialized diagnostic imaging methods used in small animals: magnetic resonance imaging (MRI) and computed tomography (CT). These tools, once out of reach, are now becoming increasingly important as an additional aid in the identification of tumor processes in the central nervous system. The objective of the present study was to describe the imaging findings obtained in 32 cases of encephalic tumors examined by CT and MRI between 2004 and 2011. Of these, 19/32 were diagnosed by MRI and 13/32 by CT; the most affected breed was the Boxer (9/32), and the mean age was 10 years.
Abstract:
This study aimed to evaluate the control of different populations of Digitaria insularis with glyphosate herbicide, alone and in mixture, as well as the combination of methods (chemical and mechanical) to manage resistant adult plants. Three experiments were conducted, one in pots kept under non-controlled conditions and two under field conditions. In the pot experiment, twelve populations of D. insularis were sprayed with glyphosate alone (1.44 and 2.16 kg a.e. ha⁻¹) and mixed (1.44 and 2.16 kg a.e. ha⁻¹) with quizalofop-p-tefuryl (0.12 kg a.i. ha⁻¹). The treatment of 1.44 kg a.e. ha⁻¹ of glyphosate plus 0.12 kg a.i. ha⁻¹ of quizalofop was sufficient for adequate control (>95%) of all populations. Population 11 (a grain production area in Itumbiara, GO) was considered sensitive to glyphosate; the other populations were moderately sensitive or tolerant to the herbicide. In the field, the D. insularis plants in one of the experiments were mowed and in the other they were not. Eight herbicide treatments [glyphosate alone (1.44 and 2.16 kg a.e. ha⁻¹) and mixed (1.44 and 2.16 kg a.e. ha⁻¹) with quizalofop-p-tefuryl at 0.12 kg a.i. ha⁻¹, clethodim at 0.108 kg a.i. ha⁻¹ or nicosulfuron at 0.06 kg a.i. ha⁻¹] were assessed, with or without a sequential application of the standard treatment sprayed 15 days after the first application. The combination of mechanical control with the application of glyphosate (2.16 and 1.44 kg a.e. ha⁻¹) plus quizalofop-p-tefuryl (0.12 kg a.i. ha⁻¹) or clethodim (0.108 kg a.i. ha⁻¹), followed by the sequential application, was the most effective strategy for the management of adult plants of resistant D. insularis.
Abstract:
In the city of São Paulo, where about 11 million people live, landslides and flooding occur frequently, especially during the summer. These landslides cause the destruction of houses and urban infrastructure, economic damage, and the loss of lives, and the number of areas threatened by landslides has been increasing each year. The objective of this article is to analyze the probability of risk and the susceptibility to shallow landslides in the Limoeiro River basin, located at the head of the Aricanduva River basin, one of the main hydrographic basins in the city of São Paulo. To map areas of risk, we created a cadastral survey form to evaluate landslide risk in the field. Risk was categorized into four levels based on natural and anthropogenic factors: R1 (low risk), R2 (average risk), R3 (high risk), and R4 (very high risk). To analyze susceptibility to shallow landslides, we used the SHALSTAB (Shallow Landsliding Stability) mathematical model and calculated the Distribution Frequency (DF) of the susceptibility classes for the entire basin. Finally, we performed a joint analysis of the average Risk Concentration (RC) and Risk Potential (RP). We mapped 14 risk sectors containing approximately 685 at-risk homes, more than half of which presented a high (R3) or very high (R4) probability of risk to the population. In the susceptibility map, 41% of the area was classified as stable and 20% as unconditionally unstable. Although the latter category accounted for a smaller proportion of the total area, it contained a concentration (RC) of 41% of the mapped risk areas, with a risk potential (RP) of 12%. We found that the locations of areas predicted to be unstable by the model coincided with the risk areas mapped in the field. This combination of methods can be applied to evaluate the risk of shallow landslides in densely populated areas and can assist public managers in defining areas that are unstable and inappropriate for occupation. (C) 2012 Elsevier B.V. All rights reserved.
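For reference, SHALSTAB couples a steady-state hydrologic model with the infinite-slope stability criterion; for cohesionless soil the critical steady-state rainfall can be written as below (this is the standard Montgomery-Dietrich formulation, and the parameter values adopted in the study are not reproduced here):

    \frac{q_{cr}}{T} = \frac{b}{a}\,\sin\theta\,\frac{\rho_s}{\rho_w}\left(1 - \frac{\tan\theta}{\tan\phi}\right)

Here q_cr is the critical rainfall, T the soil transmissivity, a/b the drainage area per unit contour length, \theta the local slope angle, \phi the soil internal friction angle, and \rho_s and \rho_w the saturated soil and water densities. Cells predicted to fail even without recharge are classed as unconditionally unstable, while cells that remain stable even when fully saturated are unconditionally stable.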
Abstract:
OBJECTIVE: This study aimed to investigate the frequency of positive results for hepatitis B and C, HIV and syphilis in blood donations at the Centro Regional de Hemoterapia de Ribeirão Preto, to describe donors with positive results according to some demographic and socioeconomic variables, and to identify risk factors associated with these donors and the reasons they were not detected during clinical screening. METHODS: A descriptive study was performed between July 1st 2005 and July 31st 2006 by interviewing 106 donors after the medical consultations in which they were informed of positive results for hepatitis B, hepatitis C, HIV or syphilis. RESULTS: There was a predominance of first-time donors, males, individuals under 50 years of age, married individuals, residents of Ribeirão Preto, people with elementary education and low economic status, and people who donated at the request of friends or relatives. Hepatitis C was the most frequently detected infection (56.6%), followed by hepatitis B (20.7%), HIV (12.3%) and syphilis (10.4%). About 40% of donors had omitted risk factors for different reasons: because they trusted the results of the serological tests, did not feel comfortable talking about risk factors, or did not consider them relevant. Other justifications included the duration of the interview, lack of skill on the part of the interviewer, embarrassment, and doubts about confidentiality. CONCLUSION: The results indicate the need for changes in the approach to clinical screening and a review of methods to attract and guide potential donors.
Abstract:
The objective of this thesis is to improve the understanding of which processes and mechanisms affect the distribution of polychlorinated biphenyls (PCBs) and organic carbon in coastal sediments. Because of the strong association of hydrophobic organic contaminants (HOCs) such as PCBs with organic matter in the aquatic environment, these two entities are naturally linked. The coastal environment is the most complex and dynamic part of the ocean when it comes to the cycling of both organic matter and HOCs, and it is characterised by the largest fluxes and most diverse sources of both entities. A wide array of methods was used to study these processes throughout this thesis. At the field sites in the Stockholm archipelago of the Baltic proper, bottom sediments and settling particulate matter were retrieved using sediment coring devices and sediment traps at morphometrically and seismically well-characterised locations. In the laboratory, the samples were analysed for PCBs, stable carbon isotope ratios, carbon-nitrogen atom ratios and standard sediment properties. From the fieldwork in the Stockholm archipelago and the subsequent laboratory work it was concluded that the inner Stockholm archipelago has a low (≈ 4%) trapping efficiency for freshwater-derived organic carbon. The corollary is a large potential for long-range waterborne transport of OC and OC-associated nutrients and hydrophobic organic pollutants from urban Stockholm to more pristine offshore Baltic Sea ecosystems. Theoretical work was carried out using Geographical Information Systems (GIS) and statistical methods on a database of 4214 individual sediment samples, each with reported individual PCB congener concentrations. From this work it was concluded that continental shelf sediments are key global inventories and ultimate sinks of PCBs: depending on the congener, 10-80% of the cumulative historical emissions to the environment are accounted for in continental shelf sediments. It was further concluded that the many infamous and highly contaminated surface sediments of urban harbours and estuaries of contaminated rivers cannot be of importance as a secondary source sustaining the concentrations observed in remote sediments. Of the global shelf PCB inventory, < 1% is in sediments near population centres while ≥ 90% is in remote areas (> 10 km from any dwellings). The remote sub-basin of the North Atlantic Ocean contains approximately half of the global shelf sediment inventory for most of the PCBs studied.
Abstract:
Work carried out by: Garijo, J. C., Hernández León, S.
Abstract:
We analyze the discontinuity preserving problem in TV-L1 optical flow methods. These methods typically create rounding effects at flow boundaries, which usually do not coincide with object contours. A simple strategy to overcome this problem consists in inhibiting the diffusion at high image gradients. In this work, we first introduce a general framework for TV regularizers in optical flow and relate it to some standard approaches. Our survey takes into account several methods that use decreasing functions to mitigate the diffusion at image contours. However, this kind of strategy may produce instabilities in the estimation of the optical flow. Hence, we study the problem of instabilities and show that it actually arises from an ill-posed formulation. From this study, it is possible to devise different schemes to solve the problem. One of these consists in separating the pure TV process from the mitigating strategy. This has been used in another work, and we demonstrate here that it performs well. Furthermore, we propose two alternatives to avoid the instability problems: (i) a fully automatic approach that solves the problem based on the information of the whole image; (ii) a semi-automatic approach that takes into account the image gradients in a close neighborhood, adapting the parameter at each position. In the experimental results, we present a detailed study and comparison of the different alternatives. These methods provide very good results, especially for sequences with a few dominant gradients. Additionally, a surprising effect of these approaches is that they can cope with occlusions. This can be easily achieved by using strong regularizations and high penalizations at image contours.
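For context, a common form of the gradient-weighted TV-L1 energy that such methods minimize is sketched below; the exact functional and the choice of decreasing function g used in the paper are assumptions.

    E(u) = \int_\Omega g(|\nabla I_0|)\,\bigl(|\nabla u_1| + |\nabla u_2|\bigr)\,dx
           + \lambda \int_\Omega \bigl|I_1(x + u(x)) - I_0(x)\bigr|\,dx,
    \qquad g(s) = e^{-\alpha s^{\beta}}

Here u = (u_1, u_2) is the flow field, \lambda weighs the data attachment term, and g decreases with the image gradient so that diffusion of the flow is inhibited near image contours.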
Abstract:
Aerosol particles and water vapour are two important constituents of the atmosphere. Their interaction, i.e. the condensation of water vapour on particles, brings about the formation of cloud, fog and rain drops, driving the water cycle on the Earth and contributing to climate change. Understanding the roles of water vapour and aerosol particles in this interaction has become an essential part of understanding the atmosphere. In this work, heterogeneous nucleation on pre-existing aerosol particles by the condensation of water vapour in the flow of a capillary nozzle was investigated, including theoretical and numerical modelling as well as experiments on this condensation process. Based on the results of the theoretical and numerical modelling, the idea of designing a new nozzle condensation nucleus counter (Nozzle-CNC), i.e. using the capillary nozzle to create an expanding water-saturated air flow, was put forward, and various experiments were carried out with this Nozzle-CNC under different experimental conditions. Firstly, the air stream in the long capillary nozzle with an inner diameter of 1.0 mm was modelled as a steady, compressible and heat-conducting turbulent flow with the CFX-FLOW3D computational program. An adiabatic and isentropic cooling in the nozzle was found. A supersaturation can be created in the nozzle if the inlet flow is water saturated, and its value depends principally on the flow velocity or flow rate through the nozzle. Secondly, a model of particle condensational growth in the air stream was developed: an extended Mason diffusion growth equation with a size correction for particles beyond the continuum regime and a correction for finite particle Reynolds number in an accelerating flow. The modelling results show rapid condensational growth of aerosol particles, especially fine particles, in the nozzle stream. On the one hand, this growth may induce evident 'over-sizing' and 'over-numbering' effects in aerosol measurements, since nozzle designs are widely employed to produce accelerating and focused aerosol beams in instruments such as the optical particle counter (OPC) and the aerodynamic particle sizer (APS). On the other hand, it can be exploited in constructing the Nozzle-CNC. Thirdly, based on the optimisation of the theoretical and numerical results, the new Nozzle-CNC was built. Experiments with this instrument were carried out under various conditions of flow rate, ambient temperature and fraction of aerosol in the total flow. An interesting exponential relation between the saturation in the nozzle and the number concentration of atmospheric nuclei, including hygroscopic nuclei (HN), cloud condensation nuclei (CCN) and traditionally measured atmospheric condensation nuclei (CN), was found. This relation differs from the relation for the number concentration of CCN obtained by other researchers. The minimum detectable size of this Nozzle-CNC is 0.04 μm. Although further improvements are still needed, this Nozzle-CNC has several advantages over other CNCs: no condensation delay, as particles larger than the critical size grow simultaneously; low diffusion losses of particles; little water condensation on the inner wall of the instrument; adjustable saturation, and therefore a wide counting range; and, in contrast to CNCs based on non-water condensing substances, no calibration requirement.
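For reference, the classical Mason diffusion growth equation on which the extended model builds can be written as below; the size and ventilation corrections introduced in the thesis are not reproduced here.

    r\,\frac{dr}{dt} = \frac{S - 1}{F_K + F_D},
    \qquad F_K = \left(\frac{L}{R_v T} - 1\right)\frac{L\,\rho_w}{K\,T},
    \qquad F_D = \frac{\rho_w R_v T}{D\,e_s(T)}

Here r is the droplet radius, S the saturation ratio, L the latent heat of vaporization, K the thermal conductivity of air, D the diffusivity of water vapour, e_s(T) the saturation vapour pressure, \rho_w the density of water and R_v the specific gas constant of water vapour.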
Abstract:
Nowadays, it is clear that the goal of creating a sustainable future for the next generations requires re-thinking the industrial application of chemistry. It is also evident that more sustainable chemical processes may be economically convenient in comparison with conventional ones, because fewer by-products mean lower costs for raw materials, separation and disposal treatments; they also imply an increase in productivity and, as a consequence, smaller reactors can be used. In addition, an indirect gain can derive from the better public image of a company marketing sustainable products or processes. In this context, oxidation reactions play a major role, being the tool for the production of huge quantities of chemical intermediates and specialties. Potentially, the impact of these productions on the environment could have been much worse than it is if continuous effort had not been spent on improving the technologies employed. Substantial technological innovations have driven the development of new catalytic systems and the improvement of reaction and process technologies, helping to move the chemical industry towards a more sustainable and ecological approach. The roadmap for the application of these concepts includes new synthetic strategies, alternative reactants, catalyst heterogenisation and innovative reactor configurations and process design. In order to turn all these ideas into real projects, the development of more efficient reactions is a primary target. Yield, selectivity and space-time yield are the right metrics for evaluating reaction efficiency. In the case of catalytic selective oxidation, the control of selectivity has always been the principal issue, because the formation of total oxidation products (carbon oxides) is thermodynamically more favoured than the formation of the desired, partially oxidized compound. As a matter of fact, only in a few oxidation reactions is total, or close to total, conversion achieved, and usually the selectivity is limited by the formation of by-products or co-products, which often implies unfavourable process economics; moreover, sometimes the cost of the oxidant further penalizes the process. During my PhD work, I investigated four reactions that are emblematic of the new approaches used in the chemical industry. In Part A of my thesis, a new process aimed at a more sustainable production of menadione (vitamin K3) is described. The “greener” approach includes the use of hydrogen peroxide in place of chromate (moving from a stoichiometric to a catalytic oxidation), also avoiding the production of dangerous waste. Moreover, I studied the possibility of using a heterogeneous catalytic system able to efficiently activate hydrogen peroxide. The overall process would be carried out in two steps: the first is the methylation of 1-naphthol with methanol to yield 2-methyl-1-naphthol, and the second is the oxidation of the latter compound to menadione. The catalyst for this latter step, the reaction that was the object of my investigation, consists of Nb2O5-SiO2 prepared by the sol-gel technique. The catalytic tests were first carried out under conditions that simulate the in-situ generation of hydrogen peroxide, that is, using a low concentration of the oxidant; experiments were then carried out using higher hydrogen peroxide concentrations. The study of the reaction mechanism was fundamental for obtaining indications of the best operating conditions and improving the selectivity to menadione. In Part B, I explored the direct oxidation of benzene to phenol with hydrogen peroxide. The industrial process for phenol is the oxidation of cumene with oxygen, which also co-produces acetone. This can be considered a case in which economics could drive the sustainability issue; in fact, the new process, which yields phenol directly and avoids the co-production of acetone (a burden for phenol, because the market requirements for the two products are quite different), might be economically convenient with respect to the conventional process if a high selectivity to phenol were obtained. Titanium silicalite-1 (TS-1) is the catalyst chosen for this reaction. Comparing the reactivity results obtained with TS-1 samples having different chemical-physical properties, and analysing in detail the effect of the most important reaction parameters, we could formulate some hypotheses concerning the reaction network and mechanism. Part C of my thesis deals with the hydroxylation of phenol to hydroquinone and catechol. This reaction is already applied industrially but, for economic reasons, an improvement of the selectivity to the para-dihydroxylated compound and a decrease of the selectivity to the ortho isomer would be desirable. Also in this case, the catalyst used was TS-1. The aim of my research was to find a method to control the selectivity ratio between the two isomers and thus make the industrial process more flexible, in order to adapt the process performance to fluctuations in market requirements. The reaction was carried out both in a batch stirred reactor and in a re-circulating fixed-bed reactor. In the first system, the effect of various reaction parameters on catalytic behaviour was investigated: type of solvent or co-solvent, and particle size. With the second reactor type, I investigated the possibility of using a continuous system with the catalyst shaped into extrudates (instead of powder), in order to avoid the catalyst filtration step. Finally, Part D deals with the study of a new process for the valorisation of glycerol by means of its transformation into valuable chemicals. This molecule is nowadays produced in large amounts as a co-product of biodiesel synthesis; therefore, it is considered a raw material from renewable resources (a bio-platform molecule). Initially, we tested the oxidation of glycerol in the liquid phase with hydrogen peroxide and TS-1; however, the results achieved were not satisfactory. We then investigated the gas-phase transformation of glycerol into acrylic acid, with the intermediate formation of acrolein; the latter can be obtained by dehydration of glycerol and then oxidized to acrylic acid. Since the oxidation step from acrolein to acrylic acid is already optimized at an industrial level, we decided to investigate in depth the first step of the process. I studied the reactivity of heterogeneous acid catalysts based on sulphated zirconia. Tests were carried out under both aerobic and anaerobic conditions, in order to investigate the effect of oxygen on the catalyst deactivation rate (one of the main problems usually encountered in glycerol dehydration). Finally, I studied the reactivity of bifunctional systems made of Keggin-type polyoxometalates, either alone or supported on sulphated zirconia, in this way combining the acid functionality (necessary for the dehydration step) with the redox functionality (necessary for the oxidation step). In conclusion, during my PhD work I investigated reactions that apply the rules and strategies of “green chemistry”; in particular, I studied new greener approaches for the synthesis of chemicals (Part A and Part B), the optimisation of reaction parameters to make an oxidation process more flexible (Part C), and the use of a bio-platform molecule for the synthesis of a chemical intermediate (Part D).
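As a reminder of the efficiency metrics invoked above, the usual definitions are sketched below in simplified form; the notation is mine, and stoichiometric coefficients are omitted by assuming a 1:1 stoichiometry.

    X = \frac{n_{reactant,\,converted}}{n_{reactant,\,fed}}, \qquad
    S = \frac{n_{product}}{n_{reactant,\,converted}}, \qquad
    Y = X \cdot S, \qquad
    STY = \frac{m_{product}}{V_{reactor}\; t}

The yield Y thus combines conversion X and selectivity S, while the space-time yield STY expresses the mass of product obtained per unit reactor volume and time.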
Abstract:
Members of the genera Campylobacter and Helicobacter have been in the spotlight in recent decades because of their status as animal and/or human pathogens, both confirmed and emerging, and because of their association with food-borne and zoonotic diseases. The first observations of spiral-shaped bacteria or Campylobacter-like organisms (CLO) date back to the end of the 19th century; however, the lack of adequate isolation methods hampered further research. The introduction of methods such as selective media and a filtration procedure during the 1970s led to a renewed interest in Campylobacter, especially as this enabled elucidation of their role in human hosts. On the other hand, the classification and identification of these bacteria were troublesome, mainly because of their biochemical inertness and fastidious growth requirements. In 1991, the taxonomy of Campylobacter and related organisms was thoroughly revised, and since this revision several new Campylobacter and Helicobacter species have been described. Moreover, thanks to the introduction of polyphasic taxonomic practice, the classification of these novel species is well founded. A polyphasic approach was followed here to characterize eight isolates obtained from epidemiologically unrelated rabbits, and as a result a new Campylobacter species, Campylobacter cuniculorum, was proposed (Chapter 1). Furthermore, there is a paucity of data regarding the occurrence of spiral-shaped enteric flora in leporids. In order to define the prevalence of this new species and of other CLO in leporids (Chapter 2), a total of 85 whole intestinal tracts of rabbits reared on 32 farms and of 29 captured hares, epidemiologically unrelated, were collected just after evisceration at the slaughterhouse or during necropsy. Examination and isolation methods were varied in order to increase the sensitivity of detection, and 100% of rabbit farms were positive for C. cuniculorum at high concentrations. Moreover, in 3.53% of the rabbits examined a Helicobacter species was detected, whereas all hares were negative for both Campylobacter and Helicobacter species. Given the high prevalence of C. cuniculorum found in rabbits, and in order to understand whether this new species could play a pathological role, a study of some virulence determinants of C. cuniculorum was conducted (Chapter 3). Although the new species was able to adhere, invade and exert cytolethal distending toxin-like effects, albeit at a low titre, a cdtB gene was not detected. There was no clear relationship between the source of isolation or disease manifestation and the possession of statistically significant levels of particular virulence-associated factors, although cell adhesion and invasion did occur. Furthermore, antibiotic susceptibility was studied (Chapter 4) in Campylobacter and Escherichia coli strains isolated from rabbits. Acquired resistance of C. cuniculorum to enrofloxacin, ciprofloxacin and erythromycin was found. The C. coli isolate was susceptible to all antimicrobials tested and is considered a wild-type strain. Moreover, E. coli was found at low caecal concentrations in rabbits, and 30 antibiotic resistance phenotypes were found, as well as a high rate of resistance to at least one antibiotic (98.1%). The majority of resistances came from strains belonging to intensive farming systems. In conclusion, in the course of the present study a new species isolated from rabbits, C. cuniculorum, was described and its high prevalence was established, whereas no Campylobacter or Helicobacter species were detected in hare samples. Some virulence determinants were analysed, but further studies are needed to understand the potential pathogenicity of this new species. Antimicrobial susceptibility was also monitored both in C. cuniculorum and in indicator bacteria, and acquired resistance was observed towards some antibiotics, indicating a possible role of rabbitries in the spread of antibiotic resistance. Further studies are necessary to evaluate the possible zoonotic role of Campylobacter cuniculorum.
Abstract:
The aim of this thesis was to describe the development of motion analysis protocols for applications on the upper and lower limbs using inertial sensor-based systems. Inertial sensor-based systems are relatively recent, so knowledge and development of methods and algorithms for using such systems for clinical purposes are still limited compared with stereophotogrammetry. However, their advantages in terms of low cost, portability and small size are a valid reason to follow this direction. When developing motion analysis protocols based on inertial sensors, attention must be given to several aspects, such as the accuracy and reliability of inertial sensor-based systems. The need to develop specific algorithms, methods and software for using these systems for specific applications is as important as the development of the motion analysis protocols based on them. For this reason, the goal of the 3-year research project described in this thesis was pursued first of all by trying to correctly design the protocols based on inertial sensors, exploring and identifying which features were suitable for the specific application of each protocol. The use of optoelectronic systems was necessary because they provided a gold-standard, accurate measurement that was used as a reference for the validation of the protocols based on inertial sensors. The protocols described in this thesis can be particularly helpful for rehabilitation centers in which the high cost of instrumentation or the limited working areas do not allow the use of stereophotogrammetry. Moreover, many applications requiring upper and lower limb motion analysis to be performed outside the laboratory will benefit from these protocols, for example gait analysis performed along corridors. Outdoors, steady-state walking or the behavior of prosthetic devices when encountering slopes or obstacles during walking can also be assessed. The application of inertial sensors to lower limb amputees presents conditions that are challenging for magnetometer-based systems, due to the ferromagnetic materials commonly adopted in the construction of hydraulic components or motors. The INAIL Prostheses Centre stimulated and, together with Xsens Technologies B.V., supported the development of additional methods for improving the accuracy of the MTx in measuring the 3D kinematics of lower limb prostheses, with the results provided in this thesis. In the author's opinion, this thesis and the motion analysis protocols based on inertial sensors described here demonstrate how close collaboration between industry, clinical centers and research laboratories can improve knowledge and exchange know-how, with the common goal of developing new application-oriented systems.
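As a minimal illustration of the kind of sensor fusion such protocols rely on, a single-axis complementary filter estimating an angle from gyroscope and accelerometer data is sketched below; this is a generic example, not the algorithm used by the MTx units or by the protocols described in the thesis.

    import numpy as np

    def complementary_filter(gyro_rate, acc_angle, dt, alpha=0.98):
        """Fuse gyroscope rate (rad/s) and accelerometer-derived angle (rad)
        into a drift-compensated angle estimate (single-axis sketch)."""
        angle = np.zeros(len(gyro_rate))
        angle[0] = acc_angle[0]                                # initialise from the accelerometer
        for k in range(1, len(gyro_rate)):
            gyro_pred = angle[k - 1] + gyro_rate[k] * dt       # integrate angular velocity
            angle[k] = alpha * gyro_pred + (1 - alpha) * acc_angle[k]  # correct slow drift
        return angle

The weight alpha trades short-term gyroscope accuracy against long-term accelerometer stability; full orientation estimation in commercial units uses more elaborate (e.g. Kalman-based) filters.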
Abstract:
Sports biomechanics describes human movement from a performance enhancement and an injury reduction perspective. In this respect, the purpose of sports scientists is to support coaches and physicians with reliable information about athletes' technique. The lack of methods allowing both in-field athlete evaluation and accurate joint force estimates represents, to date, the main limitation to this purpose. The investigations illustrated in the present thesis aimed at providing a contribution towards the development of the above-mentioned methods. Two complementary approaches were adopted: a Low Resolution Approach – related to performance assessment – in which wearable inertial measurement units are exploited during different phases of sprint running, and a High Resolution Approach – related to joint kinetics estimation for injury prevention – in which subject-specific, non-rigid constraints for knee joint kinematic modelling used in multi-body optimization techniques are defined. Results obtained using the Low Resolution Approach indicated that, due to their portability and inexpensiveness, inertial measurement systems are a valid alternative to laboratory-based instrumentation for in-field performance evaluation of sprint running. Using acceleration and angular velocity data, the following quantities were estimated: trunk inclination and angular velocity, instantaneous horizontal velocity and displacement of a point approximating the centre of mass, and stride and support phase durations. As concerns the High Resolution Approach, results indicated that the lengths of the anterior cruciate and lateral collateral ligaments decreased, while that of the deep bundle of the medial collateral ligament increased significantly during flexion. Variations in the lengths of the posterior cruciate ligament and the superficial bundle of the medial collateral ligament were concealed by the experimental indeterminacy. A mathematical model was provided that allows the estimation of subject-specific ligament lengths as a function of knee flexion and that can be integrated into a multi-body optimization procedure.
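As an illustration of how a subject-specific ligament length model could enter a multi-body optimization as a non-rigid constraint, a minimal sketch is given below; the polynomial form, the calibration data and the penalty weight are assumptions chosen for illustration, not the model defined in the thesis.

    import numpy as np

    # Hypothetical calibration data: knee flexion angles (deg) and a measured
    # ligament length (mm) for one subject.
    flexion = np.array([0, 30, 60, 90, 120])
    length = np.array([38.0, 37.2, 36.1, 35.0, 34.2])

    # Fit a low-order polynomial giving ligament length as a function of flexion.
    coeffs = np.polyfit(flexion, length, deg=2)
    ligament_length = np.poly1d(coeffs)

    def constraint_penalty(flexion_angle, candidate_length, weight=1.0):
        """Soft, non-rigid constraint term that a multi-body optimization could
        add to its cost: penalise deviations from the subject-specific length."""
        return weight * (candidate_length - ligament_length(flexion_angle)) ** 2

In this sketch the constraint is soft (a quadratic penalty) rather than rigid, which reflects the non-rigid character of the constraints mentioned above.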