857 results for importance performance analysis


Relevance: 80.00%

Publisher:

Abstract:

The unprecedented increase in competition, as well as protectionism in world markets, makes it imperative for a country like India to become much more energetically involved in the export business and to make the dictum "export and flourish" truly hold, as against the somewhat passive "export and perish" approach followed during the last three and a half decades. India now needs to evolve new export strategies to cope with the changing international scenario and to ensure steady improvement in its otherwise sagging export performance. The search for such strategic measures becomes all the more important in view of the government's all-out efforts to expand the country's exports, tide over crippling balance-of-payments deficits and generate the foreign exchange needed to meet the import requirements of accelerating economic development. The present study is an endeavour in this direction. Taking engineering exports as an example, it demonstrates alternative ways of conducting in-depth export performance analysis and of drawing lessons for better performance in future.

Cerebral glioma is the most prevalent primary brain tumor and is classified broadly into low and high grades according to the degree of malignancy. High-grade gliomas are highly malignant, carry a poor prognosis, and patients survive less than eighteen months after diagnosis. Low-grade gliomas are slow growing, least malignant and respond better to therapy. To date, histological grading is the standard technique for diagnosis, treatment planning and survival prediction. The main objective of this thesis is to propose novel methods for the automatic extraction of low- and high-grade glioma and other brain tissues, grade-detection techniques for glioma using conventional magnetic resonance imaging (MRI) modalities, and 3D modelling of glioma from segmented tumor slices in order to assess tumor growth rate. Two new methods are developed for extracting tumor regions; the second, named the Adaptive Gray level Algebraic set Segmentation Algorithm (AGASA), can also extract white matter and grey matter from T1 FLAIR and T2 weighted images. The methods were validated against manual ground-truth images and showed promising results. They were also compared with the widely used fuzzy c-means clustering technique, and the robustness of the algorithm with respect to noise was checked for different noise levels. Image texture can provide significant information on the (ab)normality of tissue, and this thesis extends this idea to tumour texture grading and detection. Based on thresholds of discriminant first-order and gray-level co-occurrence matrix (GLCM) based second-order statistical features, three feature sets were formulated and a decision system was developed for grade detection of glioma from the conventional T2 weighted MRI modality. Quantitative performance analysis using the ROC curve showed 99.03% accuracy in distinguishing between advanced (aggressive) and early-stage (non-aggressive) malignant glioma.
The developed brain texture analysis techniques can improve the physician's ability to detect and analyse pathologies, leading to more reliable diagnosis and treatment of disease. The segmented tumors were also used for volumetric modelling, which gives an idea of the tumor's growth rate; this can be used to assess response to therapy and patient prognosis.
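The second-order texture measures referred to above can be sketched in a few lines. The following is a minimal illustration in Python; the gray-level quantisation, the single horizontal offset, and the decision threshold are assumptions for the example, not the thesis's discriminant values:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Contrast and energy from a gray-level co-occurrence matrix built
    with a single horizontal offset; quantising to `levels` gray levels
    is an illustrative choice, as is the offset."""
    q = np.floor(img.astype(float) / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for i in range(h):
        for j in range(w - 1):
            glcm[q[i, j], q[i, j + 1]] += 1    # count co-occurring pairs
    p = glcm / glcm.sum()                      # normalise to probabilities
    ii, jj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    contrast = np.sum(p * (ii - jj) ** 2)      # local intensity variation
    energy = np.sum(p ** 2)                    # texture uniformity
    return contrast, energy

def grade(contrast, threshold=5.0):
    """Hypothetical single-feature decision; the thesis derives its own
    discriminant thresholds over three feature sets."""
    return "high-grade" if contrast > threshold else "low-grade"
```

A rough, rapidly varying texture yields a much larger contrast than a smooth intensity ramp, which is what the threshold decision exploits.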

The purpose of this study is to carry out a computerised gait analysis of subjects with unilateral transfemoral amputation using endoskeletal and exoskeletal prostheses. Method: the participants were two soldiers of the Ejército Nacional Colombiano, aged 25 +/- 10 years, with unilateral transfemoral (AK) amputation of the left limb of traumatic origin, in the prosthetic phase, amputated more than three years earlier. Motion-analysis software, the Ariel Performance Analysis System (APAS), was used to determine kinematic gait variables such as the angular displacement of the lower-limb joints in the different planes and the cadence of movement; gait parameters such as step length, stride length and step width; energy consumption during locomotion; and the duration of the gait cycle. The resulting data were compared against the parameters of normal gait as reported in the literature. Results: The results are presented as interactive graphs showing the behaviour of each measured variable compared with normal gait. In AK amputees using above-knee prostheses, the gait pattern is associated with short bilateral steps. The physiotherapist should emphasise the rehabilitation of gait parameters so that they approximate normal gait. Conclusions: The use of this technology in physiotherapy with amputees yields precise data for all the study variables, which can improve the rehabilitation of these patients in the prosthetic phase and provide effective re-education of the gait pattern.
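The spatiotemporal gait parameters listed above follow from simple definitions over heel-strike events. The sketch below (Python) illustrates those definitions only; it is not APAS output, and heel-strike detection is assumed to have been done upstream:

```python
import numpy as np

def gait_parameters(t_left, t_right, x_left):
    """Cadence (steps/min), stride length (m) and gait-cycle time (s)
    from heel-strike times of each foot and the left-heel positions at
    those strikes (illustrative definitions)."""
    strikes = np.sort(np.concatenate([t_left, t_right]))
    # A step occurs between successive strikes of opposite feet.
    cadence = 60.0 * (len(strikes) - 1) / (strikes[-1] - strikes[0])
    stride_length = np.mean(np.diff(x_left))   # same-foot progression
    cycle_time = np.mean(np.diff(t_left))      # strike to next same-foot strike
    return cadence, stride_length, cycle_time
```

With strikes of each foot half a second apart, for example, this yields a cadence of 120 steps/min and a one-second gait cycle.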

Performance analysis has been used for many applications, including providing feedback to coaches and players, media applications, the scoring of sports performance and scientific research into sports performance. The current study used performance analysis to generate knowledge of the demands of netball competition, which informed the development of a Netball Specific Fitness Test (NSFT). A modified version of the Bloomfield movement classification provided a detailed analysis of player movement during netball competition, which was considered during a needs analysis when proposing the structure of the NSFT. A series of pilot versions was tested during an evolutionary prototyping process that resulted in the final version of the NSFT, which was found to be representative of movement in netball competition and distinguished between recreational club players and players of university first-team level or above. The test is incremental and involves forward, backward and sideways movement, jumping, lunging, turning and choice reaction.

The question of what Monte Carlo methods can and cannot do efficiently is discussed for some functional spaces that define the regularity of the input data. Data classes important for practical computations are considered: classes of functions with bounded derivatives and Hölder-type conditions, as well as Korobov-like spaces. A theoretical performance analysis of some algorithms with unimprovable rate of convergence is given. Estimates of the computational complexity of two classes of algorithms, deterministic and randomized, are presented for both problems: numerical multidimensional integration and the calculation of linear functionals of the solution of a class of integral equations.
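As a concrete baseline for these complexity estimates, crude Monte Carlo integration attains a dimension-independent O(n^-1/2) statistical error, the rate that algorithms with unimprovable convergence improve upon for smoother (Hölder or Korobov-class) integrands. A minimal sketch with an illustrative integrand:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_integrate(f, dim, n):
    """Crude Monte Carlo estimate of the integral of f over [0,1]^dim.
    The statistical error decays as O(n**-0.5) regardless of dim -- slow,
    but independent of the dimension."""
    x = rng.random((n, dim))        # n uniform samples in the unit cube
    return f(x).mean()

# Illustrative integrand: the integral of x1*...*x5 over [0,1]^5 is 2**-5.
est = mc_integrate(lambda x: np.prod(x, axis=1), dim=5, n=200_000)
```

Quadrupling the sample count halves the expected error, which is exactly the n^-1/2 behaviour discussed in the complexity estimates.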

In this paper we deal with the performance analysis of Monte Carlo algorithms for large linear algebra problems. We consider the applicability and efficiency of Markov chain Monte Carlo for large problems, i.e., problems involving matrices with between one million and one billion non-zero elements. We concentrate on the analysis of the almost optimal Monte Carlo (MAO) algorithm for evaluating bilinear forms of matrix powers, since these form the so-called Krylov subspaces. Results are presented comparing the performance of the robust and non-robust Monte Carlo algorithms. The algorithms are tested on large dense matrices as well as on large unstructured sparse matrices.
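The bilinear forms v^T A^k h mentioned above can be estimated by random walks whose initial and transition densities are proportional to |v_i| and |a_ij|, which is the idea behind the MAO sampling. A dense-matrix sketch follows; the walk count is illustrative and the paper's robust variant is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

def mao_bilinear(A, v, h, k, n_walks=20_000):
    """Monte Carlo estimate of v^T A^k h by random walks of length k.
    Densities proportional to |v_i| and |a_ij| follow the almost
    optimal (MAO) choice (assumes v has no zero entries here)."""
    n = A.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()                      # initial density
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)  # transition density
    total = 0.0
    for _ in range(n_walks):
        i = rng.choice(n, p=p0)
        W = v[i] / p0[i]                                  # importance weight
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            W *= A[i, j] / P[i, j]
            i = j
        total += W * h[i]
    return total / n_walks
```

Each walk costs only k transitions regardless of the matrix dimension, which is why the approach scales to matrices with up to a billion non-zero elements.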

Searching for the optimum tap-length that best balances the complexity and steady-state performance of an adaptive filter has attracted attention recently. Among existing algorithms in the literature, two, namely the segmented filter (SF) and gradient descent (GD) algorithms, are of particular interest as they can search for the optimum tap-length quickly. In this paper we first compare the SF and GD algorithms carefully and show that the two are equivalent in performance under some constraints, but each has advantages and disadvantages relative to the other. We then propose an improved variable tap-length algorithm using the concept of the pseudo fractional tap-length (FT). Updating the tap-length with instantaneous errors in a style similar to that of the stochastic gradient [or least mean squares (LMS)] algorithm, the proposed FT algorithm not only retains the advantages of both the SF and GD algorithms but also has significantly lower complexity than existing algorithms. Both a performance analysis and numerical simulations are given to verify the proposed algorithm.
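A minimal sketch of a pseudo fractional tap-length update running on top of standard LMS, in the spirit described above; the leakage and gain constants, the segment offset and the test system below are illustrative assumptions, not the paper's tuning:

```python
import numpy as np

rng = np.random.default_rng(2)

def ft_lms(x, d, mu=0.01, alpha=0.01, gamma=0.1, delta=2, l0=4, l_max=64):
    """Variable tap-length LMS with a pseudo fractional tap-length lf.
    lf grows when dropping the last `delta` taps noticeably worsens the
    instantaneous error, and leaks (alpha) toward shorter filters."""
    lf = float(l0)
    L = l0
    w = np.zeros(l_max)
    lengths = []
    for n in range(len(x)):
        u = np.zeros(l_max)
        m = min(n + 1, l_max)
        u[:m] = x[n::-1][:m]                        # regressor, newest first
        e_L = d[n] - w[:L] @ u[:L]                  # full-length error
        e_s = d[n] - w[:L - delta] @ u[:L - delta]  # shortened-filter error
        w[:L] += mu * e_L * u[:L]                   # standard LMS update
        # Pseudo fractional tap-length update: leak alpha, gain gamma.
        lf = min(max(lf - alpha + gamma * (e_s**2 - e_L**2), delta + 1), l_max)
        L = int(round(lf))
        lengths.append(L)
    return w, lengths

# Illustrative system identification: an 8-tap FIR channel (assumed values).
h_true = np.array([1.0, -0.5, 0.3, 0.2, -0.1, 0.4, -0.2, 0.1])
x = rng.standard_normal(3000)
d = np.convolve(x, h_true)[:len(x)]
w, lengths = ft_lms(x, d)    # tap-length grows from 4 toward the true order
```

Because lf is updated by instantaneous squared errors only, the search adds a handful of operations per sample, which is the complexity advantage claimed over the SF and GD searches.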

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. This work proposes a fully decentralised algorithm (Epidemic K-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art distributed K-Means algorithms based on sampling methods. The experimental analysis confirms that the proposed algorithm is a practical and accurate distributed K-Means implementation for networked systems of very large and extreme scale.
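The decentralised idea can be sketched as local assignment followed by gossip averaging of per-cluster sums and counts, so that no node ever needs global communication. The following is an illustration of the principle (random pairwise averaging, shared initial centroids), not the paper's exact protocol:

```python
import numpy as np

rng = np.random.default_rng(3)

def epidemic_kmeans(local_data, init, rounds=5, gossip_steps=50):
    """Decentralised K-Means sketch: each node assigns its own points,
    then nodes converge on the global centroids by gossip averaging of
    per-cluster sums and counts (pairwise exchanges only)."""
    k, dim = init.shape
    n_nodes = len(local_data)
    centroids = init.astype(float).copy()
    for _ in range(rounds):
        sums = np.zeros((n_nodes, k, dim))
        counts = np.zeros((n_nodes, k))
        for node, X in enumerate(local_data):
            labels = ((X[:, None, :] - centroids[None]) ** 2).sum(-1).argmin(1)
            for c in range(k):
                sums[node, c] = X[labels == c].sum(axis=0)
                counts[node, c] = (labels == c).sum()
        # Gossip: random pairwise averaging drives every node to the mean.
        for _ in range(gossip_steps):
            a, b = rng.choice(n_nodes, size=2, replace=False)
            sums[a] = sums[b] = (sums[a] + sums[b]) / 2
            counts[a] = counts[b] = (counts[a] + counts[b]) / 2
        # Any node's ratio now approximates the global centroid update.
        centroids = sums[0] / np.maximum(counts[0], 1e-12)[:, None]
    return centroids
```

Because the ratio of averaged sums to averaged counts equals the ratio of totals, each node recovers (approximately) the same centroids a centralised update would compute, and more gossip steps tighten the approximation, matching the "as closely as desired" claim.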

We propose a new satellite mission to deliver high quality measurements of upper air water vapour. The concept centres around a LiDAR in limb sounding by occultation geometry, designed to operate as a very long path system for differential absorption measurements. We present a preliminary performance analysis with a system sized to send 75 mJ pulses at 25 Hz at four wavelengths close to 935 nm, to up to 5 microsatellites in a counter-rotating orbit, carrying retroreflectors characterized by a reflected beam divergence of roughly twice the emitted laser beam divergence of 15 µrad. This provides water vapour profiles with a vertical sampling of 110 m; preliminary calculations suggest that the system could detect concentrations of less than 5 ppm. A secondary payload of a fairly conventional medium resolution multispectral radiometer allows wide-swath cloud and aerosol imaging. The total weight and power of the system are estimated at 3 tons and 2,700 W respectively. This novel concept presents significant challenges, including the performance of the lasers in space, the tracking between the main spacecraft and the retroreflectors, the refractive effects of turbulence, and the design of the telescopes to achieve a high signal-to-noise ratio for the high precision measurements. The mission concept was conceived at the Alpbach Summer School 2010.
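The differential absorption principle behind the measurement reduces, in its simplest form, to a two-wavelength Beer-Lambert retrieval over the retroreflected (two-way) path. The sketch below uses illustrative cross-sections and path length, not the mission's values:

```python
import numpy as np

def dial_density(p_on, p_off, sigma_on, sigma_off, path_m, two_way=True):
    """Mean molecular number density (m^-3) along the path from the
    Beer-Lambert ratio of the absorbed ('on') and reference ('off')
    returns: N = ln(P_off/P_on) / (f * (sigma_on - sigma_off) * L),
    with f = 2 for a retroreflected (two-way) path."""
    f = 2.0 if two_way else 1.0
    return np.log(p_off / p_on) / (f * (sigma_on - sigma_off) * path_m)
```

Taking the ratio of the two wavelengths cancels scattering losses, instrument throughput and geometry, leaving only the differential absorption by water vapour, which is why the occultation geometry can reach such low concentrations.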

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (Epidemic K-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of the sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.

The paper seeks to explore in depth the ways in which rhetorical strategies are employed in the international accounting standard setting process. The study proposes that, rather than simply detailing new accounting requirements, the texts and drafts of accounting standards are artefacts, i.e. deliberately and carefully crafted products that construct, persuade and encourage certain beliefs and behaviours. Persuasive and constructive strategies are also employed by the constituents submitting comment letters on the regulatory proposals. Consequently, the international accounting standard setting process is an 'interactive process of meaning making' (Fairclough, 1989). The study regards accounting as a social construct based on intersubjectivity (Searle, 1995; Davidson, 1990, 1994) and posits language as a constitutive factor in the process (Saussure, 1916; Peirce, 1931-58). This approach to the use of language, and to the role of rhetoric as a tool to persuade others of one's perception of 'accounting reality', is supported by the sociological work of Bourdieu (1990, 1991), who drew attention to how language becomes used, controlled, reformed and reconstituted by social agents for the purpose of establishing their dominance. We explore in particular the joint IASB and FASB proposals and subsequent regulations on the scope of consolidation and relevant disclosures that address off-balance-sheet financing, a timely subject of great topical importance. The analysis has revealed sophisticated rhetorical devices used both by the Boards and by the lobbyists, reflecting the Aristotelian ethos, pathos and logos. The research demonstrates that those using accounting standards, as well as those reading comment letters on proposals for new standards, should be aware of the normative nature of these documents and the subjectivity inherent in their texts.

Much is made of the viscerally disturbing qualities embedded in The Texas Chain Saw Massacre (human bodies are traumatised, mutilated and distorted) and the way these are matched by close and often intense access to the performers involved. Graphic violence focused on the body specifically marks the film as a key contemporary horror text. Yet, for all this closeness to the performers, close analysis of the film soon makes clear that access to them is equally characterised by extreme distance, both spatially and cognitively. The issue of distance is particularly striking, not least because of its ramifications for engagement, which raises various aesthetic and methodological questions concerning performers' expressive authenticity. This article considers the lack of access to performance in The Texas Chain Saw Massacre, paying particular attention to how this fits in with contemporaneous presentations of performance more generally, as seen in films such as Junior Bonner (Sam Peckinpah, 1972). As part of this investigation I consider the effect of such a severe disruption of access on engagement with, and discussion of, performance. At the heart of this investigation lie methodological considerations of the place of performance analysis in the post-studio period. How can we perceive anything of a character's interior life, and therefore engage with performers to whom we fundamentally lack access? Does such an apparently significant difference in the way performers and their embodiment are treated mean that they can even be thought of as delivering a performance?

Cellular neural networks (CNNs) have locally connected neurons. This characteristic makes CNNs well suited to hardware implementation and, consequently, to a variety of applications such as real-time image processing and the construction of efficient associative memories. Adjusting CNN parameters is a complex problem in configuring a CNN as an associative memory. This paper reviews methods of associative memory design based on CNNs and provides a comparative performance analysis of these approaches.
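The standard CNN cell dynamics underlying such designs can be sketched with an Euler step; the templates below are illustrative (self-feedback only), not a tuned associative-memory design:

```python
import numpy as np

def cnn_step(x, A, B, u, z, dt=0.05):
    """One Euler step of the standard CNN state equation
       dx/dt = -x + A * y + B * u + z,   y = 0.5*(|x + 1| - |x - 1|),
    where * is 2-D correlation of the 3x3 feedback (A) and control (B)
    templates with each cell's neighbourhood (zero padding at borders)."""
    y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))      # piecewise-linear output
    def corr3(img, t):
        p = np.pad(img, 1)
        out = np.zeros_like(img, dtype=float)
        for di in range(3):
            for dj in range(3):
                out += t[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
        return out
    return x + dt * (-x + corr3(y, A) + corr3(u, B) + z)

# Illustrative run: pure self-feedback drives every cell to a binary state,
# the bistability that associative-memory designs exploit.
A = np.zeros((3, 3)); A[1, 1] = 2.0
B = np.zeros((3, 3))
u = np.zeros((8, 8)); z = 0.0
x = np.full((8, 8), 0.1)
for _ in range(300):
    x = cnn_step(x, A, B, u, z)
```

Associative-memory design then amounts to choosing A, B and z so that the stored patterns become the stable binary equilibria of these dynamics, which is exactly the parameter-adjustment problem the survey compares methods for.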

This study takes a practical perspective, aiming to integrate knowledge generated by research into care activities in a general university hospital, addressing specifically the detection of depression. Depression is a worldwide public health problem, a highly prevalent mental disorder with a high cost to health systems. Among hospitalised clinical and surgical patients it increases the complexity of treatment, implies greater morbidity and mortality, and also increases the length and cost of hospital stays. Yet depression is underdiagnosed. This study, which originated in a project whose objective was to create an instrument for detecting depression usable in routine care, based on evaluating the performance of existing screening scales, unfolds into three articles. The first, already accepted for publication in an internationally indexed journal, revisits earlier studies carried out at the end of the 1980s; it compares the detection of depression by non-psychiatrist physicians and by nurses at the Hospital de Clínicas de Porto Alegre (HCPA) in 1987 and in 2002. The second article presents the construction of the new scale from items selected from other validated scales using Rasch logistic models; composed of only six items, the new scale requires less time to administer. The third article evaluates the performance of the new scale, called the Escala de Depressão em Hospital Geral (EDHG), in another sample of adult clinical and surgical inpatients at the HCPA. The second and third articles have been submitted for international publication.
These studies, conducted in clinical and surgical inpatient units of the Hospital de Clínicas de Porto Alegre, support the following conclusions: a) comparing the 1987 and 2002 findings, the prevalence of depression and its diagnosis among hospitalised adult clinical and surgical patients remain at the same levels; b) it was possible to select a set of six items, constituting the new Escala de Depressão em Hospital Geral (EDHG), based on the individual performance of each of the 48 items of three other scales (BDI, CES-D and HADS); c) the EDHG performed similarly to the scales from which it was derived, using the PRIME-MD as the gold standard, with the advantage of a small number of items, making it a potential alerting device for detecting depression in general-hospital routine.

Research on the profile of the food industry in the 1990-95 period found a greater share for the group of higher value-added products, notably dairy, as well as an increase in food consumption in general driven by the rise in workers' real income after the Plano Real. Adjustments to new competitiveness standards brought about by market liberalisation were also observed, with consequences including reduced net margins and reduced employment.