884 results for Performance analysis


Relevance:

60.00%

Publisher:

Abstract:

The purpose of this study is to perform a computerized gait analysis in subjects with unilateral transfemoral amputation fitted with endoskeletal and exoskeletal prostheses. Method: the participants were two soldiers of the Colombian National Army, aged 25 ± 10 years, with unilateral transfemoral (AK) amputation in the prosthetic phase, amputated more than 3 years earlier for traumatic causes, on the left limb. A motion analysis software package, the Ariel Performance Analysis System (APAS), was used to determine kinematic gait variables such as the angular displacement of the lower-limb joints in the different planes and the cadence of the movements; gait parameters such as step length, stride length and step width; energy expenditure during locomotion; and the duration of the gait cycle. The resulting data were compared with one another and against the parameters of normal gait reported in the literature. Results: The results are presented as interactive graphs showing the behaviour of each measured variable compared with normal gait. In AK amputees who use above-knee prostheses, the gait pattern is associated with short bilateral steps. The physiotherapist should emphasise the rehabilitation of gait parameters so that they approach those of normal gait. Conclusions: The use of this technology in physiotherapy intervention with amputees yields accurate data for all the study variables, which can improve the rehabilitation of these individuals in the prosthetic phase and provide effective re-education of the gait pattern.
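
As a rough illustration of the kind of spatiotemporal gait parameters mentioned above (gait cycle time, stride length, cadence), the following Python sketch computes them from heel-strike times and heel marker positions; the function name and the sample values are hypothetical and are unrelated to the APAS output format.

    import numpy as np

    def gait_parameters(heel_strike_times_s, heel_positions_m):
        # heel_strike_times_s: successive heel strikes of the same foot (s)
        # heel_positions_m: (N, 2) heel x/y positions at those strikes (m)
        times = np.asarray(heel_strike_times_s, dtype=float)
        pos = np.asarray(heel_positions_m, dtype=float)
        cycle_durations = np.diff(times)                               # gait cycle time (s)
        stride_lengths = np.linalg.norm(np.diff(pos, axis=0), axis=1)  # stride length (m)
        cadence = 2 * 60.0 / cycle_durations.mean()                    # steps/min (2 steps per stride)
        return {"mean_cycle_time_s": cycle_durations.mean(),
                "mean_stride_length_m": stride_lengths.mean(),
                "cadence_steps_per_min": cadence}

    # Made-up heel-strike data for a slow gait:
    print(gait_parameters([0.0, 1.25, 2.52, 3.80],
                          [[0.0, 0.0], [1.10, 0.0], [2.18, 0.0], [3.30, 0.0]]))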

Relevance:

60.00%

Publisher:

Abstract:

Performance analysis has been used for many applications, including providing feedback to coaches and players, media applications, scoring of sports performance and scientific research into sports performance. The current study used performance analysis to generate knowledge about the demands of netball competition, which informed the development of a Netball Specific Fitness Test (NSFT). A modified version of the Bloomfield movement classification was used to provide a detailed analysis of player movement during netball competition, and this was considered during a needs analysis when proposing the structure of the NSFT. A series of pilot versions was tested during an evolutionary prototyping process that resulted in the final version of the NSFT, which was found to be representative of movement in netball competition and to distinguish between recreational club players and players of university first-team level or above. The test is incremental and involves forward, backward and sideways movement, jumping, lunging, turning and choice reaction.
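
As an illustration of how time-coded movement observations can be turned into a demands profile, the following Python sketch tallies the time spent in a handful of hypothetical movement categories; it is only in the spirit of a Bloomfield-style classification, not the modified scheme used in the study.

    from collections import defaultdict

    # (category, start_s, end_s) observations for one player; all values are made up.
    events = [
        ("sprint_forward", 0.0, 1.8), ("shuffle_sideways", 1.8, 3.1),
        ("jump", 3.1, 3.6), ("stand", 3.6, 7.0), ("run_backward", 7.0, 8.4),
    ]

    totals = defaultdict(float)
    for category, start, end in events:
        totals[category] += end - start

    observed_time = sum(totals.values())
    for category, seconds in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{category:18s} {seconds:5.1f} s  ({100 * seconds / observed_time:4.1f} %)")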

Relevance:

60.00%

Publisher:

Abstract:

The question of what Monte Carlo models can and cannot do efficiently is discussed for some functional spaces that define the regularity of the input data. Data classes important for practical computations are considered: classes of functions with bounded derivatives and Hölder-type conditions, as well as Korobov-like spaces. A theoretical performance analysis of some algorithms with an unimprovable rate of convergence is given. Estimates of the computational complexity of two classes of algorithms, deterministic and randomized, are presented for both problems: numerical multidimensional integration and the calculation of linear functionals of the solution of a class of integral equations. (c) 2007 Elsevier Inc. All rights reserved.
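
To make the convergence-rate discussion concrete, the following Python sketch shows plain (crude) Monte Carlo integration over the unit cube, whose stochastic error decays as O(N^{-1/2}); the refined algorithms analysed in the paper for smoother Hölder- and Korobov-type classes are not reproduced here.

    import numpy as np

    def mc_integrate(f, dim, n_samples, rng=None):
        # Crude Monte Carlo estimate of the integral of f over [0,1]^dim.
        rng = rng or np.random.default_rng(0)
        x = rng.random((n_samples, dim))
        values = f(x)
        estimate = values.mean()
        std_error = values.std(ddof=1) / np.sqrt(n_samples)   # ~ O(N^{-1/2})
        return estimate, std_error

    # Example: integral of prod(cos(x_i)) over [0,1]^5, exact value sin(1)^5.
    f = lambda x: np.cos(x).prod(axis=1)
    est, err = mc_integrate(f, dim=5, n_samples=100_000)
    print(est, "+/-", err, " exact:", np.sin(1.0) ** 5)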

Relevance:

60.00%

Publisher:

Abstract:

In this paper we deal with the performance analysis of Monte Carlo algorithms for large linear algebra problems. We consider the applicability and efficiency of Markov chain Monte Carlo for large problems, i.e., problems involving matrices with a number of non-zero elements ranging between one million and one billion. We concentrate on the analysis of the Almost Optimal Monte Carlo (MAO) algorithm for evaluating bilinear forms of matrix powers, since these forms generate the so-called Krylov subspaces. Results are presented comparing the performance of the Robust and Non-robust Monte Carlo algorithms. The algorithms are tested on large dense matrices as well as on large unstructured sparse matrices.
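
The following Python sketch illustrates a Markov chain Monte Carlo estimator for a bilinear form v^T A^k h with probabilities proportional to the magnitudes of v and of the rows of A, in the spirit of an almost optimal choice; it is a dense toy version for clarity, not the authors' MAO implementation or their robust variant.

    import numpy as np

    def mc_bilinear_form(A, v, h, k, n_chains=20_000, rng=None):
        # Estimate v^T A^k h by random walks i_0 -> i_1 -> ... -> i_k.
        rng = rng or np.random.default_rng(1)
        n = A.shape[0]
        p0 = np.abs(v) / np.abs(v).sum()          # initial distribution ~ |v|
        P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)  # transitions ~ |a_ij|
        total = 0.0
        for _ in range(n_chains):
            i = rng.choice(n, p=p0)
            w = v[i] / p0[i]
            for _ in range(k):
                j = rng.choice(n, p=P[i])
                w *= A[i, j] / P[i, j]
                i = j
            total += w * h[i]
        return total / n_chains

    # Small check against the exact value on a random 50 x 50 matrix.
    rng = np.random.default_rng(2)
    A = rng.random((50, 50)) / 60.0
    v, h = rng.random(50), rng.random(50)
    print(mc_bilinear_form(A, v, h, k=3), "exact:", v @ np.linalg.matrix_power(A, 3) @ h)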

Relevance:

60.00%

Publisher:

Abstract:

Searching for the optimum tap-length that best balances the complexity and steady-state performance of an adaptive filter has attracted attention recently. Among existing algorithms in the literature, two, namely the segmented filter (SF) and gradient descent (GD) algorithms, are of particular interest as they can search for the optimum tap-length quickly. In this paper, we first carefully compare the SF and GD algorithms and show that the two algorithms are equivalent in performance under some constraints, but that each has advantages and disadvantages relative to the other. We then propose an improved variable tap-length algorithm using the concept of the pseudo fractional tap-length (FT). By updating the tap-length with instantaneous errors in a style similar to that used in the stochastic gradient [or least mean squares (LMS)] algorithm, the proposed FT algorithm not only retains the advantages of both the SF and GD algorithms but also has significantly lower complexity than existing algorithms. Both a performance analysis and numerical simulations are given to verify the proposed algorithm.
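
The sketch below illustrates the fractional tap-length idea in Python: a standard LMS update is combined with a pseudo fractional tap-length that drifts according to the difference between the squared errors of the full filter and of a filter shortened by delta taps. The update rule, parameter values and the toy system-identification example are illustrative assumptions, not the exact FT algorithm proposed in the paper.

    import numpy as np

    def ft_lms(x, d, mu=0.01, gamma=0.05, alpha=1e-4, delta=4, L0=8, L_max=128):
        w = np.zeros(L_max)            # only the first L taps are adapted
        L, l_frac = L0, float(L0)
        L_hist, e_hist = [], []
        for n in range(L_max, len(x)):
            u = x[n - L_max + 1:n + 1][::-1]                   # newest sample first
            e_L = d[n] - np.dot(w[:L], u[:L])                  # error with L taps
            e_S = d[n] - np.dot(w[:L - delta], u[:L - delta])  # error with L - delta taps
            w[:L] += mu * e_L * u[:L]                          # standard LMS update
            l_frac = (l_frac - alpha) - gamma * (e_L**2 - e_S**2)  # fractional length drift
            L = int(np.clip(round(l_frac), delta + 1, L_max))
            L_hist.append(L); e_hist.append(e_L)
        return w, np.array(L_hist), np.array(e_hist)

    # Toy system identification: unknown FIR channel with 20 taps (hypothetical).
    rng = np.random.default_rng(3)
    h = rng.standard_normal(20) * np.exp(-0.2 * np.arange(20))
    x = rng.standard_normal(20_000)
    d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
    w, L_hist, e_hist = ft_lms(x, d)
    print("final tap-length:", L_hist[-1], " final MSE:", (e_hist[-2000:]**2).mean())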

Relevance:

60.00%

Publisher:

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. This work proposes a fully decentralised algorithm (Epidemic K-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art distributed K-Means algorithms based on sampling methods. The experimental analysis confirms that the proposed algorithm is a practical and accurate distributed K-Means implementation for networked systems of very large and extreme scale.
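
The following Python sketch conveys the epidemic/gossip idea behind such a decentralised K-Means: each node keeps local sufficient statistics (per-cluster sums and counts) and repeatedly averages them with a randomly chosen peer, after which any node can approximate the global centroids. It is a toy illustration under simplified assumptions (pairwise averaging, no failures), not the Epidemic K-Means protocol itself.

    import numpy as np

    rng = np.random.default_rng(4)
    n_nodes, k, dim = 30, 3, 2
    true_centres = np.array([[0, 0], [5, 5], [0, 6]], dtype=float)   # synthetic data only

    # Shared initial centroids and one local assignment step per node.
    centroids = true_centres + rng.normal(0, 1.5, true_centres.shape)
    local_sums = np.zeros((n_nodes, k, dim))
    local_counts = np.zeros((n_nodes, k))
    for node in range(n_nodes):
        data = true_centres[rng.integers(0, k, 100)] + rng.normal(0, 0.5, (100, dim))
        labels = np.argmin(((data[:, None, :] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            local_sums[node, j] = data[labels == j].sum(axis=0)
            local_counts[node, j] = (labels == j).sum()

    # Gossip: repeated pairwise averaging of the per-cluster statistics.
    for _ in range(200):
        a, b = rng.choice(n_nodes, size=2, replace=False)
        mean_s = (local_sums[a] + local_sums[b]) / 2
        mean_c = (local_counts[a] + local_counts[b]) / 2
        local_sums[a] = local_sums[b] = mean_s
        local_counts[a] = local_counts[b] = mean_c

    # Any node can now estimate the global centroids for this K-Means iteration.
    print(local_sums[0] / local_counts[0][:, None])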

Relevance:

60.00%

Publisher:

Abstract:

We propose a new satellite mission to deliver high-quality measurements of upper-air water vapour. The concept centres on a LiDAR in limb-sounding occultation geometry, designed to operate as a very long path system for differential absorption measurements. We present a preliminary performance analysis with a system sized to send 75 mJ pulses at 25 Hz at four wavelengths close to 935 nm, to up to 5 microsatellites in a counter-rotating orbit, carrying retroreflectors characterized by a reflected beam divergence of roughly twice the emitted laser beam divergence of 15 µrad. This provides water vapour profiles with a vertical sampling of 110 m; preliminary calculations suggest that the system could detect concentrations of less than 5 ppm. A secondary payload of a fairly conventional medium-resolution multispectral radiometer allows wide-swath cloud and aerosol imaging. The total weight and power of the system are estimated at 3 tons and 2,700 W respectively. This novel concept presents significant challenges, including the performance of the lasers in space, the tracking between the main spacecraft and the retroreflectors, the refractive effects of turbulence, and the design of the telescopes to achieve a high signal-to-noise ratio for the high-precision measurements. The mission concept was conceived at the Alpbach Summer School 2010.
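
The differential-absorption principle behind the proposed measurement can be sketched as follows in Python: the ratio of received powers at an absorbing ("on") and a reference ("off") wavelength over a path segment yields the mean water vapour number density. All cross-sections, powers and the path length below are made-up illustrative values, not instrument specifications from the mission study.

    import numpy as np

    def dial_number_density(p_on, p_off, sigma_on, sigma_off, path_length_m, passes=2):
        # Mean number density (molecules / m^3) over one path segment,
        # assuming a two-way (retroreflected) path by default.
        delta_sigma = sigma_on - sigma_off
        return np.log(p_off / p_on) / (passes * delta_sigma * path_length_m)

    # Illustrative numbers only.
    sigma_on, sigma_off = 4.0e-26, 1.0e-28      # absorption cross-sections (m^2), hypothetical
    n_true = 5e-6 * 5e23                        # ~5 ppm at an assumed air number density (1/m^3)
    path = 300e3                                # hypothetical limb path segment (m)
    tau = 2 * (sigma_on - sigma_off) * n_true * path
    p_off, p_on = 1.0, np.exp(-tau)
    print(dial_number_density(p_on, p_off, sigma_on, sigma_off, path))   # recovers ~2.5e18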

Relevance:

60.00%

Publisher:

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault-tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (EpidemicK-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.

Relevance:

60.00%

Publisher:

Abstract:

Much is made of the viscerally disturbing qualities embedded in The Texas Chain Saw Massacre (human bodies are traumatised, mutilated and distorted) and the way these are matched by close and often intense access to the performers involved. Graphic violence focused on the body specifically marks the film as a key contemporary horror text. Yet, for all this closeness to the performers, it soon becomes clear in undertaking close analysis of the film that access to them is equally characterised by extreme distance, both spatially and cognitively. The issue of distance is particularly striking, not least because of its ramifications for engagement, which raises various aesthetic and methodological questions concerning performers' expressive authenticity. This article considers the lack of access to performance in The Texas Chain Saw Massacre, paying particular attention to how this fits in with contemporaneous presentations of performance more generally, as seen in films such as Junior Bonner (Sam Peckinpah, 1972). As part of this investigation I consider the effect of such a severe disruption to access on engagement with, and discussion of, performance. At the heart of this investigation lie methodological considerations of the place of performance analysis in the post-studio period. How can we perceive anything of a character's interior life, and therefore engage with performers to whom we fundamentally lack access? Does such an apparently significant difference in the way performers and their embodiment are treated mean that they can even be thought of as delivering a performance?

Relevance:

60.00%

Publisher:

Abstract:

While a growing number of small- and medium-sized enterprises (SMEs) are making use of coaching, little is known about the impact such coaching has within this sector. This study sought to identify the factors that influence managers' decision to engage with coaching, their perceptions of the coaching ‘journey’ and the kinds of benefits accruing from coaching: organisational, personal or both. As part of a mixed methods approach, a survey tool was developed based upon a range of relevant management competencies from the UK's Management Occupational Standards and responses analysed using importance-performance analysis, an approach first used in the marketing sector to evaluate customer satisfaction. Results indicate that coaching had a significant impact on personal attributes such as ‘Managing Self-Cognition’ and ‘Managing Self-Emotional’, whereas the impact on business-oriented attributes was weaker. Managers' choice of coaches with psychotherapeutic rather than non-psychotherapeutic backgrounds was also statistically significant. We conclude that even in the competitive business environment of SMEs, coaching was used as a largely personal, therapeutic intervention rather than to build business-oriented competencies.
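
For readers unfamiliar with importance-performance analysis, the following Python sketch shows the classical quadrant classification: each attribute is placed according to whether its mean importance and mean performance ratings fall above or below the grand means. The attribute names and ratings are hypothetical and are not the study's data.

    # name: (mean importance, mean performance) on a 1-5 scale; values are made up.
    attributes = {
        "Managing Self-Cognition": (4.6, 4.2),
        "Managing Self-Emotional": (4.4, 4.1),
        "Financial planning":      (4.5, 2.9),
        "Marketing strategy":      (3.2, 3.0),
    }

    imp_mean = sum(i for i, _ in attributes.values()) / len(attributes)
    perf_mean = sum(p for _, p in attributes.values()) / len(attributes)

    def quadrant(importance, performance):
        # Classical IPA quadrant labels (Martilla and James).
        if importance >= imp_mean and performance < perf_mean:
            return "Concentrate here"
        if importance >= imp_mean and performance >= perf_mean:
            return "Keep up the good work"
        if importance < imp_mean and performance < perf_mean:
            return "Low priority"
        return "Possible overkill"

    for name, (imp, perf) in attributes.items():
        print(f"{name:26s} -> {quadrant(imp, perf)}")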

Relevance:

60.00%

Publisher:

Abstract:

Cellular neural networks (CNNs) have locally connected neurons. This characteristic makes CNNs suitable for hardware implementation and, consequently, for employment in a variety of applications such as real-time image processing and the construction of efficient associative memories. The adjustment of CNN parameters is a complex problem involved in configuring a CNN as an associative memory. This paper reviews methods of associative memory design based on CNNs and provides a comparative performance analysis of these approaches.
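
The locally connected dynamics that underlie these designs can be sketched with the standard Chua-Yang CNN cell equation, in which each cell's state is driven only by its 3x3 neighbourhood through a feedback template A and a control template B. The Python sketch below uses an illustrative edge-detecting template pair rather than a designed associative memory.

    import numpy as np
    from scipy.signal import convolve2d

    def cnn_run(u, x0, A, B, z, dt=0.05, steps=400):
        # Euler integration of dx/dt = -x + A*y + B*u + z with zero boundary cells.
        x = x0.copy()
        for _ in range(steps):
            y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))      # piecewise-linear output
            dx = -x + convolve2d(y, A, mode="same") + convolve2d(u, B, mode="same") + z
            x = x + dt * dx
        return 0.5 * (np.abs(x + 1) - np.abs(x - 1))

    # Illustrative templates: keep only the boundary cells of a white square.
    A = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], dtype=float)
    B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
    u = np.zeros((8, 8)); u[2:6, 2:6] = 1.0
    print(cnn_run(u, x0=np.zeros_like(u), A=A, B=B, z=-1.0).round(1))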

Relevance:

60.00%

Publisher:

Abstract:

This study takes a practical perspective, aiming to integrate knowledge generated by research into care activities at a general university hospital, focusing specifically on the detection of depression. Depression is a worldwide public health problem, a highly prevalent mental disorder with a high cost to health systems. Among hospitalized clinical and surgical patients, it increases the complexity of treatment, implies higher morbidity and mortality, and also increases the length and cost of hospital stays. At the same time, depression is underdiagnosed. This study, which originated from a project whose objective was to create an instrument for the detection of depression suitable for routine care, based on an evaluation of the performance of existing screening scales, unfolds into three articles. The first, already accepted for publication in an internationally indexed journal, revisits earlier studies carried out at the end of the 1980s. It compares the detection of depression by non-psychiatrist physicians and by nurses at the Hospital de Clínicas de Porto Alegre (HCPA) in 1987 and in 2002. The second article presents the construction of the new scale, based on the selection of items from other previously validated scales, using Rasch logistic models. The new scale, composed of only six items, requires less time to administer. The third article is a performance evaluation study of the new scale, named Escala de Depressão em Hospital Geral (EDHG, General Hospital Depression Scale), conducted on another sample of adult clinical and surgical inpatients at the HCPA. The second and third articles have already been submitted for international publication. These studies, carried out in clinical and surgical inpatient units of the Hospital de Clínicas de Porto Alegre, led to the following conclusions: a) comparing the 1987 findings with those of 2002, the prevalence of depression and its diagnosis among hospitalized adult clinical and surgical patients remain at the same levels; b) it was possible to select a set of six items, which constituted the new Escala de Depressão em Hospital Geral (EDHG), based on the individual performance of each of the 48 items of three other scales (BDI, CESD and HADS); c) the EDHG performed similarly to the scales from which it was derived, using the PRIME-MD as the gold standard, with the advantage of having a small number of items, so that it may serve as an alert device for the detection of depression in general hospital routine.
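
As background to the item-selection step, the following Python sketch shows the dichotomous Rasch model: the probability of endorsing an item given a person parameter theta and an item difficulty b, together with the item information that indicates where on the severity continuum each item is most useful. The item difficulties are made-up values, not EDHG parameters.

    import numpy as np

    def rasch_probability(theta, b):
        # P(response = 1 | theta, b) in the dichotomous Rasch model.
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def item_information(theta, b):
        p = rasch_probability(theta, b)
        return p * (1.0 - p)          # Fisher information of one Rasch item

    item_difficulties = np.array([-1.5, -0.5, 0.0, 0.4, 1.0, 2.0])   # hypothetical
    for theta in np.linspace(-3, 3, 7):
        info = item_information(theta, item_difficulties).sum()
        print(f"theta={theta:+.1f}  test information={info:.2f}")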

Relevance:

60.00%

Publisher:

Abstract:

Research on the profile of the food industry in the 1990-95 period found a larger share for the group of higher value-added products, with dairy products standing out, as well as an increase in food consumption in general driven by the rise in workers' real income after the Plano Real. Adjustments to the new competitiveness standards brought about by market opening were also observed, with consequences including reduced net margins and a smaller number of jobs.

Relevance:

60.00%

Publisher:

Abstract:

With the growth of energy consumption worldwide, conventional reservoirs, those regarded as "easy" to explore and produce, are no longer meeting global energy demand. This has led many researchers to develop projects that address these needs, and companies in the oil sector have invested in techniques that help in locating and drilling wells. One of the techniques employed in the oil exploration process is Reverse Time Migration (RTM), a seismic imaging method that produces excellent images of the subsurface. It is an algorithm based on solving the wave equation and is considered one of the most advanced seismic imaging techniques. The economic value of the oil reserves that require RTM to be located is very high, which makes the development of these algorithms a competitive differentiator for seismic processing companies. However, RTM requires great computational power, which still limits its practical success. The objective of this work is to explore the implementation of this algorithm on unconventional architectures, specifically GPUs using CUDA, analysing the difficulties in developing it as well as the performance of the algorithm in its sequential and parallel versions.
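
The computational kernel at the heart of RTM is an explicit finite-difference time step of the acoustic wave equation. The Python/numpy sketch below shows a second-order 2-D version as a CPU stand-in for the CUDA kernel discussed in the work; grid size, velocity model and source are illustrative only.

    import numpy as np

    def wave_step(p_prev, p_curr, vel, dt, dx):
        # Second-order-in-time, second-order-in-space update of the pressure field
        # (periodic boundaries via np.roll, kept only for brevity).
        lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
               np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4.0 * p_curr) / dx**2
        return 2.0 * p_curr - p_prev + (vel * dt) ** 2 * lap

    nx = nz = 200
    dx, dt = 10.0, 1e-3                    # m, s (within the CFL limit for the velocity below)
    vel = np.full((nz, nx), 2000.0)        # constant 2000 m/s velocity model
    p_prev = np.zeros((nz, nx)); p_curr = np.zeros((nz, nx))
    p_curr[nz // 2, nx // 2] = 1.0         # impulsive point source

    # Forward modelling; RTM back-propagates recorded data with the same kernel.
    for _ in range(500):
        p_prev, p_curr = p_curr, wave_step(p_prev, p_curr, vel, dt, dx)

    print("max |p| after 0.5 s:", np.abs(p_curr).max())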

Relevance:

60.00%

Publisher:

Abstract:

A vibrational self-flotator prototype with an electromechanical drive for the treatment of oil-and-water emulsions, or similar emulsions, is presented and evaluated. Oil production and refining to obtain derivatives are carried out under arrangements technically referred to as onshore and offshore, i.e., on the continent and at sea. In Brazil, 80% of petroleum production takes place at sea, and the deployment area and its cost scale are a concern. This production is associated with large-scale output of oily water, which carries 95% of the pollutant potential of the activity and whose final destination is the environment, terrestrial or marine. Although a diversified set of techniques and water treatment systems is in use or under research, we propose an innovative system that operates in a sustainable way, without chemical additives, for the good of the ecosystem. A labyrinth adsorbent is used in metal spirals, at laboratory-scale flow. Equipment and process patents are claimed. Treatments were performed at different flow rates and frequency bands, monitored with control systems, some built and others purchased for this purpose. Measurements of the oil and grease content (OGC) of the treated effluents remained within the legal limits under the test conditions. Adsorbents were weighed before and after treatment to obtain the oil impregnation, the performance measure of the vibratory action and of the treatment as a whole. Current treatment technologies are referenced to compare performance, qualitatively and quantitatively. The energy consumption of vibration is compared with and without conventional flotation and self-flotation. There are good prospects for the proposal, especially in reducing the residence time through the capillary-action system. A dimensionless impregnation parameter was created and compared with established dimensionless parameters in their vibrational form, such as the Weber number and the Froude number in quadratic form, referred to as vibrational criticality. Results suggest limits to the vibration intensity.
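
One plausible way to read the dimensionless groups mentioned above is sketched below in Python, taking the product of vibration amplitude and angular frequency as the characteristic velocity; the exact definitions used by the authors, including their impregnation parameter, are not reproduced, and all numerical values are hypothetical.

    import math

    amplitude_m = 2.0e-3          # vibration amplitude (m), hypothetical
    freq_hz = 30.0                # vibration frequency (Hz), hypothetical
    rho = 998.0                   # water density (kg/m^3)
    sigma = 0.03                  # oil-water interfacial tension (N/m), rough figure
    length_m = 5.0e-3             # characteristic length (m), hypothetical
    g = 9.81

    omega = 2.0 * math.pi * freq_hz
    velocity = amplitude_m * omega                 # characteristic vibration velocity
    weber = rho * velocity**2 * length_m / sigma   # inertia vs interfacial tension
    froude_sq = velocity**2 / (g * length_m)       # Froude number in quadratic form
    criticality = amplitude_m * omega**2 / g       # vibrational acceleration over gravity

    print(f"We = {weber:.2f}, Fr^2 = {froude_sq:.2f}, Gamma = {criticality:.2f}")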