972 results for Performance improvements


Relevance:

30.00%

Publisher:

Abstract:

This study was conducted to determine whether the use of the technology known as the Classroom Performance System (CPS), commonly referred to as "clickers", improves the learning gains of students enrolled in a biology course for science majors. CPS is one of a group of developing technologies adapted to provide feedback in the classroom using a learner-centered approach. It supports and facilitates discussion among students and between students and teachers, and encourages participation by otherwise passive students. Advocates, influenced by constructivist theories, claim increased academic achievement. In science teaching, the results have been mixed, but there is some evidence of improvements in conceptual understanding. The study employed a pretest-posttest, non-equivalent groups experimental design. The sample consisted of 226 participants in six sections of a college biology course at a large community college in South Florida, taught by two instructors trained in the use of clickers. Each instructor's sections were randomly assigned to CPS (treatment) and non-CPS (control) groups. All participants completed a survey that included demographic data at the beginning of the semester. The treatment group used clicker questions throughout, with discussion as necessary, whereas the control groups answered the same questions as quizzes, similarly engaging in discussion where necessary. Learning gains were assessed on a pre/post-test basis. The average learning gains, defined as the actual gain divided by the possible gain, were slightly better in the treatment group than in the control group, but the difference was not statistically significant. An analysis of covariance (ANCOVA) with pretest scores as the covariate was conducted to test for significant differences between the treatment and control groups on the posttest. A second ANCOVA was used to determine the significance of differences between the treatment and control groups on the posttest scores after controlling for sex, GPA, academic status, experience with clickers, and instructional style. The results indicated a small increase in learning gains, but it was not statistically significant; the data did not support an increase in learning based on the use of the CPS technology. This study adds to the body of research that questions whether CPS technology merits classroom adoption.
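
The average learning gain defined above ("the actual gain divided by the possible gain") is the standard normalized gain; written out (notation mine, not the study's):

```latex
\langle g \rangle \;=\; \frac{\text{posttest score} - \text{pretest score}}{\text{maximum score} - \text{pretest score}}
```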

Relevance:

30.00%

Publisher:

Abstract:

Objective: To evaluate the impact of alcohol use, which is widespread among human immunodeficiency virus (HIV)+ individuals, on highly active antiretroviral therapy (HAART)-associated immune and cognitive improvements, and the relationship between those two responses. Methods: In a case-control longitudinal study, thymic volume, cognition, and immune responses were evaluated at baseline and after 6 months of therapy in HIV+ individuals and HIV- controls. Cognitive performance was evaluated using the HIV Dementia Score (HDS) and the California Verbal Learning Test (CVLT). Results: Prior to HAART, thymic volume varied considerably, from 2.7 to 29.3 cm³ (11 ± 7.2 cm³). Thymic volume at baseline showed a significant inverse correlation with the patient's years of drinking (r² = 0.207; p < 0.01), and was related to HDS and CVLT scores in both HIV-infected (r² = 0.37, p = 0.03) and noninfected (r² = 0.8, p = 0.01) participants. HIV-infected individuals with a small thymic volume scored in the demented range, compared with those with a larger thymus (7 ± 2.7 vs. 12 ± 2.3, p = 0.005). After HAART, light/moderate drinkers exhibited a thymus twice the size of heavy drinkers' (14.8 ± 10.4 vs. 6.9 ± 3.3 cm³). Conclusions: HAART-associated increases in thymus volume appear to be negatively affected by alcohol consumption and significantly related to patients' cognitive status. This result could have important clinical implications.

Relevance:

30.00%

Publisher:

Abstract:

Unequal improvements in processor and I/O speeds have caused many applications, such as databases and operating systems, to become increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors were proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we study traditional mirroring to provide a common basis for comparison.
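
As a minimal illustration of the mirroring concept described above (a simplified model, not the thesis's implementation):

```python
# Simplified model of disk mirroring: every write must update both replicas,
# reads can be served by either disk, and after a single disk failure all
# requests fall on the survivor.

class MirroredVolume:
    def __init__(self) -> None:
        self.disks = [{}, {}]            # two physical disks as block maps
        self.failed = [False, False]

    def write(self, block: int, data: bytes) -> None:
        # The write penalty of mirroring: both copies are updated.
        for i, disk in enumerate(self.disks):
            if not self.failed[i]:
                disk[block] = data

    def read(self, block: int) -> bytes:
        # Normal mode: either replica can serve the read (load balancing).
        # Failure mode: only the surviving disk services requests.
        for i, disk in enumerate(self.disks):
            if not self.failed[i] and block in disk:
                return disk[block]
        raise IOError("block unavailable")

    def fail_disk(self, i: int) -> None:
        self.failed[i] = True            # transparent to callers of read()

vol = MirroredVolume()
vol.write(0, b"data")
vol.fail_disk(0)
print(vol.read(0))                       # still served by the mirror
```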

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, nearly 85 million of them in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase, mainly due to advanced technologies such as multi-energy [4], photon-counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal; however, the relatively large population-based radiation level has prompted substantial efforts across the community to manage and optimize CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a responsibility of the imaging community to optimize the radiation dose of CT examinations. The key to dose optimization is determining the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics for characterizing the radiation dose and image quality of a CT exam. Moreover, if the radiation dose and image quality could be accurately predicted before the exam begins, the exam could be personalized by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to translate these theoretical models into clinical practice by developing an organ-based dose-monitoring system and an image-based noise-addition software tool for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions, and further evaluated the dependence of organ dose coefficients on patient size and scanner model. Distinct from prior work, these studies use the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.
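
The functional form of the organ dose coefficients is not given in the abstract; below is a minimal sketch under the common assumption (as in size-specific dose estimation) that the CTDIvol-normalized coefficient decays exponentially with patient size:

```python
import numpy as np

# Hypothetical sketch: organ dose under constant tube current expressed as a
# CTDIvol-normalized coefficient that decays exponentially with patient size.
# The exponential form and the a, b values are assumptions of this sketch;
# in practice they would be fitted per organ and scanner from the phantom
# library.

def organ_dose_coefficient(diameter_cm: float, a: float, b: float) -> float:
    """CTDIvol-to-organ-dose conversion coefficient vs. body diameter."""
    return a * np.exp(-b * diameter_cm)

def predict_organ_dose(ctdi_vol_mgy: float, diameter_cm: float,
                       a: float = 2.0, b: float = 0.04) -> float:
    return ctdi_vol_mgy * organ_dose_coefficient(diameter_cm, a, b)

# Example: a 12 mGy CTDIvol abdominal scan of a 30 cm diameter patient.
print(predict_organ_dose(12.0, 30.0))    # organ dose estimate in mGy
```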

With organ dose effectively quantified under constant tube current conditions, Chapter 4 extends the organ dose-prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose against dose estimates from Monte Carlo simulations with the TCM function explicitly modeled.
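
As a rough illustration of the convolution-based idea (the kernel shape and parameters are assumptions of this sketch, not the thesis's fitted values), the z-axis radiation field can be modeled as the TCM tube-current profile blurred by a dose-spread kernel:

```python
import numpy as np

# Local dose is taken as proportional to tube current, blurred along z by a
# dose-spread kernel; the Gaussian kernel and its width are illustrative.

def dose_spread_kernel(half_width_mm: int = 150, sigma_mm: float = 50.0) -> np.ndarray:
    z = np.arange(-half_width_mm, half_width_mm + 1, dtype=float)
    k = np.exp(-0.5 * (z / sigma_mm) ** 2)
    return k / k.sum()

def radiation_field(tcm_ma: np.ndarray) -> np.ndarray:
    """Relative dose along z for a 1 mm-sampled tube-current profile."""
    return np.convolve(tcm_ma, dose_spread_kernel(), mode="same")

# Example: a sinusoidal modulation profile over a 400 mm scan range;
# organ dose would then integrate this field over each organ's extent.
profile_ma = 200 + 80 * np.sin(np.linspace(0, 4 * np.pi, 400))
field = radiation_field(profile_ma)
```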

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice as an organ dose-monitoring program built on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations: the patient's major body landmarks were extracted from the scout image to match each clinical patient against a computational phantom in the library. Organ dose coefficients were estimated from the CT protocol and patient size as reported in Chapter 3, and the exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines a method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed quantum noise in clinical images and further validated the correspondence between phantom-based measurements and expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
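
The noise-insertion step in (1) is not specified in the abstract; a common image-based approach (stated here as an assumption, not the thesis's exact method) scales quantum noise with the inverse of dose:

```python
import numpy as np

# Quantum noise variance scales roughly as 1/dose, so simulating a scan at
# dose fraction r adds zero-mean noise with standard deviation
# sigma * sqrt(1/r - 1) to the full-dose image.

def simulate_reduced_dose(image_hu: np.ndarray, sigma_hu: float, r: float,
                          seed=None) -> np.ndarray:
    """Return a synthetic image at dose fraction r (0 < r <= 1)."""
    rng = np.random.default_rng(seed)
    extra_std = sigma_hu * np.sqrt(1.0 / r - 1.0)
    return image_hu + rng.normal(0.0, extra_std, size=image_hu.shape)

# Example: half-dose simulation from a measured noise level of 12 HU.
full_dose = np.zeros((512, 512))         # stand-in for a clinical slice
half_dose = simulate_reduced_dose(full_dose, sigma_hu=12.0, r=0.5)
```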

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevance:

30.00%

Publisher:

Abstract:

In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. To help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2, intended to model, analyze, and improve the supply chain management operations of an enterprise. The model provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, and provides tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated using a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain that integrates multiple perspectives and provides a systematic procedure for improving a company's supply chain operations.

Relevance:

30.00%

Publisher:

Abstract:

Thermally driven liquid-desiccant air-conditioners (LDACs) are a proven but still developing technology. LDACs can use a solar thermal system to reduce operational cost and environmental impact by reducing the amount of fuel (e.g., natural gas or propane) used to drive the system. LDACs also have the key benefit of being able to store energy in the form of concentrated desiccant. TRNSYS simulations were used to evaluate several methods of improving the thermal and electrical coefficients of performance (COPt and COPe) and the solar fraction (SF) of an LDAC. The study analyzed a typical June-to-August cooling season in Toronto, Ontario. Utilizing properly sized, high-efficiency pumps increased the COPe to 3.67, an improvement of 55%. A new design, featuring a heat recovery ventilator on the scavenging airstream and an energy recovery ventilator on the process airstream, increased the COPt to 0.58, an improvement of 32%; this also improved the SF slightly to 54%, an increase of 8%. A new TRNSYS TYPE was created to model a stratified desiccant storage tank, and different volumes of desiccant were tested with a range of solar array sizes. The largest storage tank coupled with the largest solar thermal array showed an improvement of 64% in SF, increasing the value to 82%; the COPe also improved by 17% and the COPt by 9%. When the heat recovery systems and the desiccant storage systems were combined, the simulation predicted a 78% increase in COPe, a 30% increase in COPt, a 77% improvement in SF, and a 17% increase in total cooling rate. The total thermal energy consumed was 10% lower and the electrical consumption 34% lower, while the amount of non-renewable energy needed from the natural gas boiler was 77% lower. Comparisons were also made between LDACs and vapour-compression (VC) systems: depending on set-up, LDACs provided higher latent cooling rates and reduced electrical power consumption, though the LDAC systems required a thermal input that the VC systems did not.
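
The abstract does not define its performance metrics; under the standard definitions (an assumption here), the thermal and electrical coefficients of performance and the solar fraction are:

```latex
\mathrm{COP}_t = \frac{Q_{\text{cooling}}}{Q_{\text{thermal}}}, \qquad
\mathrm{COP}_e = \frac{Q_{\text{cooling}}}{W_{\text{electrical}}}, \qquad
\mathrm{SF} = \frac{Q_{\text{solar}}}{Q_{\text{solar}} + Q_{\text{auxiliary}}}
```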

Relevance:

30.00%

Publisher:

Abstract:

Social enterprises (SEs) are normally micro and small businesses that trade to tackle social problems and to improve communities, people's life chances, and the environment; thus, their importance to society and economies is increasing. However, there is still a need for more understanding of how these organisations operate, perform, innovate, and scale up. This knowledge is crucial for designing and providing accurate strategies to enhance the sector and increase its impact and coverage. Obtaining this understanding is the main driver of this paper, which follows the theoretical lens of the Knowledge-Based View (KBV) to develop and empirically assess a novel model of knowledge management capability (KMC) development that improves the performance of SEs. The empirical assessment consisted of a quantitative study with 432 owners and senior members of SEs in the UK, underpinned by 21 interviews. The findings demonstrate how particular organisational characteristics of SEs, the external conditions in which they operate, and informal knowledge management activities have created overall improvements in their performance of up to 20% on a year-to-year comparison, including innovation and the creation of social and environmental value. These findings offer new perspectives that can contribute not only to SEs and SE supporters, but also to other firms.

Relevance:

30.00%

Publisher:

Abstract:

Approximately half of the houses in Northern Ireland were built before any form of minimum thermal specification or energy efficiency standard was enforced. Furthermore, 44% of households are categorised as being in fuel poverty, spending more than 10% of household income to heat the house to an acceptable level of thermal comfort. To bring the existing housing stock up to an acceptable standard, retrofitting to improve energy efficiency is essential, and it is also necessary to study the effectiveness of such improvements under future climate scenarios. This paper presents the results from a year-long performance monitoring of two houses that underwent energy-efficiency retrofits. Using wireless sensor technology, internal temperature and humidity, external weather, and household gas and electricity usage were monitored for a year. Simulations in the IES-VE dynamic building modelling software were calibrated against the monitoring data to ASHRAE Guideline 14 standards. The energy performance and internal environment of the houses were then assessed for current and future climate scenarios, and the results show the need for a holistic, balanced retrofitting strategy.
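
For reference, a sketch of the calibration statistics ASHRAE Guideline 14 uses to judge agreement between simulated and monitored data; the thresholds quoted in the code are the commonly cited hourly criteria and should be verified against the guideline itself:

```python
import numpy as np

# NMBE and CV(RMSE) quantify bias and scatter of a simulation against
# measurements. Hourly thresholds assumed here: |NMBE| <= 10%, CV(RMSE) <= 30%.

def nmbe(measured: np.ndarray, simulated: np.ndarray) -> float:
    """Normalized mean bias error, in percent."""
    return 100.0 * np.sum(measured - simulated) / ((len(measured) - 1) * np.mean(measured))

def cv_rmse(measured: np.ndarray, simulated: np.ndarray) -> float:
    """Coefficient of variation of the RMSE, in percent."""
    rmse = np.sqrt(np.sum((measured - simulated) ** 2) / (len(measured) - 1))
    return 100.0 * rmse / np.mean(measured)

def calibrated_hourly(measured: np.ndarray, simulated: np.ndarray) -> bool:
    return abs(nmbe(measured, simulated)) <= 10.0 and cv_rmse(measured, simulated) <= 30.0
```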

Relevance:

30.00%

Publisher:

Abstract:

In the past, many papers have shown that coating cutting tools often yields decreased wear rates and reduced coefficients of friction. Although different theories have been proposed, covering areas such as hardness theory, diffusion barrier theory, thermal barrier theory, and reduced friction theory, most have not addressed how and why coating tool substrates with hard materials such as titanium nitride (TiN), titanium carbide (TiC), and aluminium oxide (Al2O3) transforms the performance and life of cutting tools. This project discusses the complex interrelationship that encompasses the thermal barrier function and the relatively low sliding friction coefficient of TiN on an undulating tool surface, and presents the results of an investigation into the cutting characteristics and performance of EDMed surface-modified carbide cutting tool inserts. The tool inserts were coated with TiN by physical vapour deposition (PVD). PVD coating is also known as ion plating, the general term for coating methods in which the film is created by attracting ionized metal vapour (here, titanium) and ionized gas onto a negatively biased substrate surface. PVD was chosen because it is performed at temperatures of no more than 500 °C, whereas the chemical vapour deposition (CVD) process is performed at very high temperatures of about 850 °C, in two stages of heating the substrates; the high temperatures involved in CVD affect the strength of the tool substrates. In this study, comparative cutting tests, using TiN-coated control specimens with no EDM surface structures and TiN-coated EDMed tools with a crater-like surface topography, were carried out on mild steel grade EN-3. Various cutting speeds were investigated, up to 40% above the tool manufacturer's recommended speed. Fifteen minutes of cutting were carried out for each insert at each speed investigated; conventional tool inserts normally have a tool life of approximately 15 minutes of cutting. After every five cuts (passes), microscopic pictures of the tool wear profiles were taken in order to monitor the progressive wear on the rake face and on the flank of the insert. The power load was monitored for each cut using an on-board meter on the CNC machine to establish the amount of power needed for each stage of operation; the machine's spindle drive is an 11 kW motor. The results confirmed the advantages of cutting at all speeds investigated using EDMed coated inserts, in terms of reduced tool wear and low power loads; moreover, the surface finish on the workpiece was consistently better for the EDMed inserts. The thesis discusses the relevance of the finite element method in the analysis of metal cutting processes, so that metal machinists can design, manufacture, and deliver tools to the market quickly and on time without resorting to a trial-and-error approach for new products. Improvements in manufacturing technologies require better knowledge of modelling metal cutting processes; computational models have great value in reducing or even eliminating the number of experiments traditionally used for tool design, process selection, machinability evaluation, and chip breakage investigations. In this work, special attention was given to theoretical and experimental investigations of metal machining.

Finite element analysis (FEA) was given priority in this study to predict tool wear and coating deformations during machining. Particular attention was devoted to the complicated mechanisms usually associated with metal cutting, such as interfacial friction, heat generated by friction, severe strain in the cutting region, and high strain rates. It is therefore concluded that a roughened contact surface comprising peaks and valleys coated with a hard material (TiN) provides wear-resisting properties, as the coating becomes entrapped in the valleys and helps reduce friction at the chip-tool interface. The contributions to knowledge are: (a) a wear-resisting surface structure for application in contact surfaces and structures in metal cutting and forming tools; (b) a technique for designing tools with a roughened surface of peaks and valleys covered in a conformal coating of a material such as TiN or TiC, whose valleys entrap residual coating material during wear, enabling the entrapped material to give improved wear resistance; (c) knowledge for increasing tool life through wear resistance, hardness, and chemical stability at high temperatures, because tool coating reduces friction at the tool-chip and work-tool interfaces and hence heat generation at the cutting zones; and (d) the finding that undulating surface topographies on cutting tips tend to hold coating materials longer in the valleys, giving enhanced protection so that the tool can cut 40% faster and last 60% longer than conventional tools on the market today.

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, in contrast, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool: it utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a system under test (SUT) and in turn carries out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool as well as a graphical user interface. The goal of this thesis project is two-fold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, removing the requirement of local installation, and 2) to design a user interface for this web application that adds new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process are realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development; Vaadin handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java. A number of experiments were run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to increase both of these factors.
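
As an illustration of how a probabilistic user model can drive synthetic workload (the states, probabilities, and actions below are hypothetical, not MBPeT's model format):

```python
import random

# A Markov chain over user actions: each state names an action, and the
# outgoing edges carry the probabilities of the next action. Walking the
# chain yields a synthetic session; a load tool would issue one request per
# visited state (think times omitted for brevity).

MODEL = {
    "browse": [("browse", 0.5), ("search", 0.3), ("buy", 0.2)],
    "search": [("browse", 0.6), ("buy", 0.4)],
    "buy":    [("browse", 1.0)],
}

def simulate_user(steps: int, start: str = "browse") -> list:
    """Walk the model, recording one action per step."""
    state, trace = start, []
    for _ in range(steps):
        trace.append(state)
        actions, weights = zip(*MODEL[state])
        state = random.choices(actions, weights=weights)[0]
    return trace

print(simulate_user(10))
```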

Relevance:

30.00%

Publisher:

Abstract:

A key driver of Australian sweetpotato productivity improvements and consumer demand has been industry adoption of disease-free planting material systems. On a farm isolated from the main Australian sweetpotato areas, virus-free germplasm is multiplied annually, and the resulting 'pathogen-tested' (PT) sweetpotato roots are shipped to commercial Australian sweetpotato growers. They in turn plant their PT roots into specially designated plant beds, commencing in late winter, and from these beds cut sprouts as the basis for their commercial fields. Along with other intensive agronomic practices, this system enables Australian producers to achieve the world's highest commercial yields (per hectare) of premium sweetpotatoes. Their industry organisation, ASPG (Australian Sweetpotato Growers Inc.), has identified productivity of mother plant beds as a key driver of crop performance. Growers and scientists are currently collaborating to investigate issues such as catastrophic plant bed losses; optimisation of irrigation and nutrient addition; rapidity and uniformity of initial plant bed harvests; optimal plant bed harvest techniques; virus re-infection of plant beds; and the practical longevity of plant beds. A survey of 50 sweetpotato growers in Queensland and New South Wales identified substantial diversity in current plant bed systems, apparently influenced by growing district, scale of operation, time of planting, and machinery/labour availability. Growers identified the key areas for plant bed research as: optimising the size and grading specifications of PT roots supplied for the plant beds; change in sprout density, vigour, and performance through sequential cuttings of the plant bed; the optimal height above ground level to cut sprouts to maximise commercial crop and plant bed performance; and the use of structures and soil amendments in plant bed systems. Our ongoing multi-disciplinary research program integrates detailed agronomic experiments, grower adaptive learning sites, and product quality and consumer research to enhance industry capacity for inspired innovation and commercial, sustainable practice change.

Relevance:

30.00%

Publisher:

Abstract:

Master's degree in Management and Industrial Strategy (Mestrado em Gestão e Estratégia Industrial)

Relevance:

30.00%

Publisher:

Abstract:

In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and programmers use it to speed up their applications thanks to its low latency. Prior work from the authors proposed lightweight hardware TM (HTM) support based on the local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, this work gathers together a number of proposals aimed at improving those mechanisms with high impact on performance. First, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Second, the conflict detection mechanism is optimized depending on application characteristics, such as the read/write sets, the probability of conflict between transactions, and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which are more important for a given application. This work includes a discussion of the analysis required to choose the best configuration.
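
A minimal sketch of read/write-set conflict detection, the general technique named above (an illustration, not the authors' hardware design):

```python
from dataclasses import dataclass, field

# Two transactions conflict when one's write set intersects the other's
# read or write set; conflicting transactions must be serialized.

@dataclass
class Tx:
    read_set: set = field(default_factory=set)
    write_set: set = field(default_factory=set)

def conflicts(a: Tx, b: Tx) -> bool:
    return bool(a.write_set & (b.read_set | b.write_set) or
                b.write_set & (a.read_set | a.write_set))

# Read-only transactions never conflict with one another, which is why the
# detector can be relaxed when an application exhibits many of them.
t1 = Tx(read_set={0x10}, write_set={0x20})
t2 = Tx(read_set={0x20})
print(conflicts(t1, t2))    # True: t1 writes what t2 reads -> serialize
```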

Relevance:

30.00%

Publisher:

Abstract:

In elite sport, technology has come to play a role of considerable importance in the analysis and evaluation of performance. In recent years, new technologies have emerged and pre-existing ones (i.e., accelerometers, gyroscopes, and video-analysis software) have improved in terms of sampling, data acquisition, and sensor size, which has made sensors wearable and allowed their integration into sports equipment. Technology has always been at the service of athletes as a support tool for reaching the peak of sporting results. For this reason, the functional evaluation of athletes, combined with the use of technology, aims to assess athletes' improvements by measuring physical condition and/or the technical proficiency of a given sporting discipline. The objective of this thesis is to study the use of technological applications and to identify new methods of performance evaluation in selected aquatic sports. The first part (Chapters 1-5) focuses on the prototype technology called E-kayak and its various applications in sprint kayaking. These studies verified the reliability of the data provided by the E-kayak system against the systems found in the literature, and investigated new parameters useful for understanding the paddler's performance model. The second part (Chapter 6) concerns the kinematic analysis of the water polo player's vertical push, using 2D video analysis to identify the force-velocity and power-velocity relationships directly in the water. This pilot study may provide useful guidance for monitoring and conditioning strength and power directly in the water. Finally, the third part (Chapters 7-8) focuses on identifying the Fibonacci sequence (the 'divine sequence') in freestyle and butterfly swimming. The results of these studies suggest that the stroke rhythm maintained over middle/long distances plays a key role, and that the level of self-similarity increases with swimming technique.
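
The abstracts do not state how self-similarity was quantified; as a toy illustration only, one could test whether ratios of successive stroke-phase durations approach the golden ratio that underlies the Fibonacci sequence:

```python
import numpy as np

# Toy illustration (not the thesis's method): compare the ratios of adjacent
# stroke-phase durations to the golden ratio phi = (1 + sqrt(5)) / 2.

PHI = (1 + 5 ** 0.5) / 2

def golden_ratio_deviation(phase_durations_s) -> float:
    """Mean absolute deviation of adjacent-duration ratios from phi."""
    d = np.asarray(phase_durations_s, dtype=float)
    ratios = d[1:] / d[:-1]
    return float(np.mean(np.abs(ratios - PHI)))

# Hypothetical durations of successive stroke phases (seconds): a value
# near zero suggests a self-similar, Fibonacci-like rhythm.
print(golden_ratio_deviation([0.21, 0.34, 0.55, 0.89]))
```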