905 results for Classification Methods


Relevance: 20.00%

Abstract:

Purpose: To evaluate which type of breast compression (gradual or non-gradual) causes less discomfort to the patient. Methods and Materials: The standard projections [craniocaudal (CC) and mediolateral-oblique (MLO)] were simulated with the two types of breast compression in 90 volunteer women aged between 19 and 86. The women were organised into groups according to breast density. The intensity of discomfort was evaluated at the end of each simulation using the Wong-Baker faces scale (0-10). A focus-group interview was also conducted to discuss the scores attributed during the pain evaluation and to identify the criteria considered in the classification. Results: Women aged 19-29 y (with higher breast density) rated the pain during non-gradual compression as 4 and during gradual compression as 2 for both projections. The MLO projection was considered the most uncomfortable. During the focus-group interview with this group, the participants highlighted that compression caused discomfort rather than pain; they considered that their high expectations of pain did not correspond to the discomfort they actually felt. Similar results were found for the older women (30-50 y; > 50 y). Conclusion: Radiographers should consider the breast compression technique. Gradual compression was considered the most comfortable by the majority of the women, regardless of breast density. The MLO projection was considered uncomfortable because of the positioning (axilla and inclusion of the pectoral muscle) and because of the higher breast compression compared with the CC projection.

Relevance: 20.00%

Abstract:

OBJECTIVE To propose a method of redistributing ill-defined causes of death (IDCD) based on the investigation of such causes. METHODS In 2010, an evaluation was performed of the results of investigating the causes of death classified as IDCD, in accordance with chapter 18 of the International Classification of Diseases (ICD-10), by the Mortality Information System. The redistribution coefficients were calculated according to the proportional distribution of ill-defined causes reclassified after investigation into any chapter of the ICD-10 except chapter 18, and were used to redistribute, by sex and age, the ill-defined causes that were not investigated and remained. The IDCD redistribution coefficient was compared with two usual redistribution methods: a) the total redistribution coefficient, based on the proportional distribution of all the defined causes originally notified, and b) the non-external redistribution coefficient, similar to the previous one but excluding external causes. RESULTS Of the 97,314 deaths from ill-defined causes reported in 2010, 30.3% were investigated, and 65.5% of those were reclassified as defined causes after the investigation. Endocrine diseases, mental disorders, and maternal causes were over-represented among the reclassified ill-defined causes, in contrast with infectious diseases, neoplasms, and genitourinary diseases, which had higher proportions among the defined causes reported. External causes represented 9.3% of the reclassified ill-defined causes. Correcting mortality rates by the total redistribution coefficient and the non-external redistribution coefficient increased the magnitude of the rates by a relatively similar factor for most causes, whereas the IDCD redistribution coefficient corrected the different causes of death with differentiated weights. CONCLUSIONS The proportional distribution of causes among the ill-defined causes reclassified after investigation was not similar to the original distribution of defined causes. Therefore, redistributing the remaining ill-defined causes based on the investigation allows more appropriate estimates of the mortality risk due to specific causes.
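The coefficient construction described in the abstract can be sketched in a few lines: the redistribution coefficient for each cause group is simply the proportion that group receives among the reclassified ill-defined causes, and the remaining uninvestigated deaths are split with those weights. All counts and cause groups below are invented placeholders, not the study's figures.

```python
# Hypothetical counts of investigated ill-defined deaths that were
# reclassified into defined-cause chapters (illustrative numbers only).
reclassified = {"infectious": 120, "neoplasms": 300, "circulatory": 580}

# Hypothetical number of ill-defined deaths never investigated.
remaining_idcd = 1000

# IDCD redistribution coefficients: proportional distribution of the
# reclassified causes over the defined-cause chapters.
total = sum(reclassified.values())
coefficients = {cause: n / total for cause, n in reclassified.items()}

# Redistribute the remaining ill-defined deaths with those weights and
# add them to the originally reported defined-cause counts.
reported = {"infectious": 2000, "neoplasms": 5000, "circulatory": 8000}
corrected = {cause: reported[cause] + remaining_idcd * coefficients[cause]
             for cause in reported}
print(corrected)
```

The total and non-external coefficients the study compares against would be built the same way, only with the proportions taken over the originally reported defined causes instead of the reclassified ones.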

Relevance: 20.00%

Abstract:

In this work we present a classification of some of the existing Penalty Methods (the so-called Exact Penalty Methods) and describe some of their limitations and estimates. With these methods we can solve optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. The approach consists of transforming the original problem into a sequence of unconstrained problems derived from the initial one, making its resolution possible by the methods known for this type of problem. Thus, Penalty Methods can be used as the first step in the resolution of constrained problems by methods typically used for unconstrained problems. The work finishes by discussing a new class of Penalty Methods, for nonlinear optimization, that adjusts the penalty parameter dynamically.
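As a rough illustration of the transformation described above, the sketch below builds an l1 exact-penalty function and minimizes it by exhaustive sampling, so no continuity or differentiability of the constraint is exploited. The one-dimensional problem, the penalty parameter, and both helper names are invented for the example; they are not the paper's formulation.

```python
def exact_penalty(f, constraints, mu):
    """l1 exact penalty: f(x) + mu * sum of constraint violations.

    Constraints are functions g with the convention g(x) <= 0.
    For mu large enough, the unconstrained minimizer of the penalized
    function coincides with the constrained one (the 'exact' property).
    """
    def penalized(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + mu * violation
    return penalized

def grid_minimize(func, lo, hi, steps=5000):
    """Derivative-free minimization by exhaustive 1-D sampling (toy solver)."""
    best_x, best_v = lo, func(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        v = func(x)
        if v < best_v:
            best_x, best_v = x, v
    return best_x

# Minimize f(x) = x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
p = exact_penalty(lambda x: x * x, [lambda x: 1.0 - x], mu=10.0)
x_star = grid_minimize(p, -2.0, 3.0)
print(round(x_star, 3))  # close to the constrained optimum x = 1
```

Here mu = 10 exceeds the threshold needed for exactness in this example; below that threshold the penalized minimizer would sit slightly inside the infeasible region.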

Relevance: 20.00%

Abstract:

The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common requirement is drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).

Relevance: 20.00%

Abstract:

Constrained and unconstrained Nonlinear Optimization Problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is not known, is too complex, or is non-smooth. In such cases, Direct Search Methods may be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the same computer where it is installed or remotely, through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the optimization researchers' point of view, however, the solution alone is not enough: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criteria. This paper presents the features added to the API to allow users to access the iterative process data.
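The kind of iterative-process data the abstract mentions can be illustrated with a toy derivative-free solver that records it as it runs. The function name, the record layout, and the stopping-criterion strings below are assumptions made for the sketch, not the API's actual interface (which is in Java).

```python
def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=200):
    """Derivative-free coordinate search that also records the process
    data a client might ask for: iteration count, the point and value at
    each iteration, and which stopping criterion fired."""
    x = list(x0)
    history = []
    for it in range(1, max_iter + 1):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                if f(trial) < f(x):      # accept any improving move
                    x, improved = trial, True
        history.append({"iteration": it, "x": list(x), "value": f(x)})
        if not improved:
            step /= 2.0                  # refine the mesh when stuck
        if step < tol:
            return {"solution": x, "iterations": history,
                    "stopping_criterion": "step size below tolerance"}
    return {"solution": x, "iterations": history,
            "stopping_criterion": "maximum iterations reached"}

result = coordinate_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2,
                           [0.0, 0.0])
print(result["solution"], result["stopping_criterion"])
```

A caller interested only in the engineering answer reads `result["solution"]`; a researcher can replay `result["iterations"]` to study the convergence behaviour.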

Relevance: 20.00%

Abstract:

In Nonlinear Optimization, Penalty and Barrier Methods are normally used to solve constrained problems. There are several Penalty/Barrier Methods, and they are used in many areas, from Engineering to Economics, through Biology, Chemistry, and Physics, among others. In these areas, optimization problems often arise in which the involved functions (objective and constraints) are non-smooth and/or their derivatives are not known. In this work some Penalty/Barrier functions are tested and compared using, in the internal process, derivative-free methods, namely Direct Search methods. This work is part of a larger project involving the development of an Application Programming Interface that implements several optimization methods, to be used in applications that need to solve constrained and/or unconstrained Nonlinear Optimization Problems. Besides its use in applied mathematics research, it is also intended for engineering software packages.
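A minimal sketch of the combination the abstract describes: a barrier function whose inner minimization is derivative-free. The one-dimensional problem, the barrier parameter schedule, and the exhaustive-sampling inner solver are all invented for illustration; the paper's actual Penalty/Barrier functions and Direct Search methods are more general.

```python
import math

def log_barrier_minimize(f, g, lo, hi, mus=(1.0, 10.0, 100.0, 1000.0),
                         steps=20000):
    """Sequential log-barrier sketch for a 1-D problem with one inequality
    constraint g(x) <= 0. For each barrier parameter mu, the subproblem
    f(x) - log(-g(x)) / mu is minimized by sampling the interval, so no
    derivatives of f or g are needed."""
    x_best = None
    for mu in mus:
        best_v = math.inf
        for i in range(steps + 1):
            x = lo + (hi - lo) * i / steps
            if g(x) >= 0.0:                  # not strictly feasible: skip
                continue
            v = f(x) - math.log(-g(x)) / mu  # barrier blows up at boundary
            if v < best_v:
                best_v, x_best = v, x
    return x_best

# Minimize f(x) = x subject to x >= 1, i.e. g(x) = 1 - x <= 0; optimum x* = 1.
x = log_barrier_minimize(lambda t: t, lambda t: 1.0 - t, lo=1.0, hi=3.0)
print(round(x, 4))
```

As mu grows, the barrier term shrinks and the subproblem minimizer approaches the constrained optimum from inside the feasible region (here, 1 + 1/mu for the final mu).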

Relevance: 20.00%

Abstract:

The goal of this study is the analysis of the dynamical properties of financial data series from 32 worldwide stock market indices during the period 2000–2009 at a daily time horizon. Stock market indices are examples of complex interacting systems for which a huge amount of data exists. The methods and algorithms that have been explored for the description of physical phenomena become an effective background in the analysis of economical data. In this perspective, the classical concepts of signal analysis, the Fourier transform, and methods of fractional calculus are applied. The results reveal classification patterns typical of fractional dynamical systems.
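One common way to quantify the "fractional dynamics" signature this kind of analysis looks for is the slope of the power spectrum in log-log coordinates: a spectrum S(f) ~ f^(-beta) appears as a straight line of slope -beta. The sketch below fits that slope to a synthetic power-law spectrum; the data are invented for illustration, not the market series used in the study.

```python
import math

def fit_power_law_slope(freqs, power):
    """Least-squares slope of log(power) versus log(frequency).
    A spectrum S(f) ~ f**(-beta) yields slope -beta."""
    xs = [math.log(f) for f in freqs]
    ys = [math.log(p) for p in power]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic spectrum with S(f) = f**-1.5 (illustrative, not market data).
freqs = [f / 100.0 for f in range(1, 51)]
power = [f ** -1.5 for f in freqs]
print(round(fit_power_law_slope(freqs, power), 3))  # → -1.5
```

On real index series the spectrum is noisy, so the fitted exponent characterizes the series statistically rather than exactly.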

Relevance: 20.00%

Abstract:

OBJECTIVE To investigate the factors related to the granting of preliminary court orders [injunctions] in drug litigations. METHODS A retrospective descriptive study of drug lawsuits in the State of Minas Gerais, Southeastern Brazil, was conducted from October 1999 to 2009. The database consists of 6,112 lawsuits, out of which 6,044 had motions for injunctions and 5,167 included the requisition of drugs. Those with more than one beneficiary were excluded, which totaled 5,072 examined suits. The variables for complete, partial, and suppressed motions were treated as dependent and assessed in relation to those that were independent – lawsuits (year, type, legal representation, defendant, court in which it was filed, adjudication time), drugs (level five of the anatomical therapeutic chemical classification), and diseases (chapter of the International Classification of Diseases). Statistical analyses were performed using the Chi-square test. RESULTS Out of the 5,072 lawsuits with injunctions, 4,184 (82.5%) had the injunctions granted. Granting varied from 95.8% of the total lawsuits in 2004 to 76.9% in 2008. Where there was legal representation, granting exceeded 80.0% and in lawsuits without representation, it did not exceed 66.9%. In public civil actions (89.1%), granting was higher relative to ordinary lawsuits (82.8%) and injunctions (80.1%). Federal courts granted only 68.6% of the injunctions, while the state courts granted 84.8%. Diseases of the digestive system and neoplasms received up to 87.0% in granting, while diseases of the nervous system, mental and behavioral disorders, and diseases of the skin and subcutaneous tissue received granting below 78.6% and showed a high proportion of suspended injunctions (10.9%). Injunctions involving paroxetine, somatropin, and ferrous sulfate drugs were all granted, while less than 54.0% of those involving escitalopram, sodium diclofenac, and nortriptyline were granted. 
CONCLUSIONS There are significant differences in the granting of injunctions depending on procedural and clinical variables. Important trends in the pattern of judicial action were observed, particularly the reduced granting [of injunctions] over the period.

Relevance: 20.00%

Abstract:

This paper presents the application of multidimensional scaling (MDS) analysis to data emerging from noninvasive lung function tests, namely the input respiratory impedance. The aim is to obtain a geometrical mapping of the diseases in a 3D space representation, allowing analysis of (dis)similarities between subjects within the same pathology group, as well as between the various groups. The adult patient groups investigated were healthy subjects, patients with diagnosed chronic obstructive pulmonary disease (COPD), and patients with diagnosed kyphoscoliosis. The children's groups were healthy, asthma, and cystic fibrosis. The results suggest that MDS can be successfully employed for mapping restrictive (kyphoscoliosis) and obstructive (COPD) pathologies. Hence, MDS tools can be further examined to define clear limits between pools of patients for clinical classification, and used as a training aid in medical traineeships.
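The study presumably used standard MDS software; as a rough sketch of what metric MDS does, the toy below embeds three hypothetical patient groups in 2-D (the paper uses 3-D) by gradient descent on the raw stress, i.e. the squared mismatch between embedded and target distances. The dissimilarity matrix is invented for illustration.

```python
import math

def mds_2d(D, iters=20000, lr=0.01):
    """Tiny metric-MDS sketch: gradient descent on the raw stress
    sum_{i<j} (||x_i - x_j|| - D[i][j])**2, embedding the items in 2-D."""
    n = len(D)
    # Deterministic, non-degenerate starting layout on the unit circle.
    X = [[math.cos(2 * math.pi * i / n), math.sin(2 * math.pi * i / n)]
         for i in range(n)]
    for _ in range(iters):
        grad = [[0.0, 0.0] for _ in range(n)]
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                dx = X[i][0] - X[j][0]
                dy = X[i][1] - X[j][1]
                d = math.hypot(dx, dy) or 1e-12
                c = 2.0 * (d - D[i][j]) / d
                grad[i][0] += c * dx
                grad[i][1] += c * dy
        for i in range(n):
            X[i][0] -= lr * grad[i][0]
            X[i][1] -= lr * grad[i][1]
    return X

def stress(X, D):
    """Raw stress of an embedding X against the target dissimilarities D."""
    s = 0.0
    for i in range(len(D)):
        for j in range(i + 1, len(D)):
            d = math.hypot(X[i][0] - X[j][0], X[i][1] - X[j][1])
            s += (d - D[i][j]) ** 2
    return s

# Invented dissimilarities: two similar groups and one distant group.
D = [[0.0, 1.0, 4.0],
     [1.0, 0.0, 4.0],
     [4.0, 4.0, 0.0]]
X = mds_2d(D)
print("final stress:", round(stress(X, D), 4))
```

A near-zero final stress means the map reproduces the dissimilarities faithfully, which is what lets (dis)similar patient groups be read off geometrically.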

Relevance: 20.00%

Abstract:

Epidemiological studies have shown the effect of diet on the incidence of chronic diseases; however, proper planning, design, and statistical modeling are necessary to obtain precise and accurate food consumption data. Methods used for short-term assessment of the food consumption of a population, such as 24-hour recalls of food intake or food diaries, can be affected by random errors or by biases inherent to the method. Statistical modeling is used to handle random errors, whereas proper design and sampling are essential for controlling biases. The present study aimed to analyze potential biases and random errors and determine how they affect the results, and to identify ways to prevent them and/or to handle them with statistical approaches in epidemiological studies involving dietary assessment.

Relevance: 20.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in Environmental Engineering.

Relevance: 20.00%

Abstract:

The goal of this study is the analysis of the dynamical properties of financial data series from worldwide stock market indexes during the period 2000–2009. We analyze, under a regional criterion, ten main indexes at a daily time horizon. The methods and algorithms that have been explored for the description of dynamical phenomena become an effective background in the analysis of economical data. We start by applying the classical concepts of signal analysis, the fractional Fourier transform, and methods of fractional calculus. In a second phase we adopt the multidimensional scaling approach. Stock market indexes are examples of complex interacting systems for which a huge amount of data exists. Therefore, these indexes, viewed from different perspectives, lead to new classification patterns.

Relevance: 20.00%

Abstract:

In practice, robotic manipulators present some degree of unwanted vibrations. The advent of lightweight arm manipulators, mainly in the aerospace industry, where weight is an important issue, leads to the problem of intense vibrations. On the other hand, robots interacting with the environment often generate impacts that propagate through the mechanical structure and also produce vibrations. In order to analyze these phenomena, a robot signal acquisition system was developed. The manipulator motion produces vibrations, either from the structural modes or from end-effector impacts. The instrumentation system acquires signals from several sensors that capture the joint positions, mass accelerations, forces and moments, and electrical currents in the motors. Afterwards, an analysis package, running off-line, reads the data recorded by the acquisition system and extracts the signal characteristics.

Due to the multiplicity of sensors, the data obtained can be redundant, because the same type of information may be seen by two or more sensors. Given the price of the sensors, this aspect can be considered in order to reduce the cost of the system. On the other hand, the placement of the sensors is an important issue in order to obtain suitable signals of the vibration phenomenon. Moreover, the study of these issues can help in the design optimization of the acquisition system. In this line of thought, a sensor classification scheme is presented.

Several authors have addressed the subject of sensor classification schemes. White (White, 1987) presents a flexible and comprehensive categorizing scheme that is useful for describing and comparing sensors. The author organizes the sensors according to several aspects: measurands, technological aspects, detection means, conversion phenomena, sensor materials, and fields of application. Michahelles and Schiele (Michahelles & Schiele, 2003) systematize the use of sensor technology. They identified several dimensions of sensing that represent the sensing goals for physical interaction. A conceptual framework is introduced that allows categorizing existing sensors and evaluating their utility in various applications. This framework not only guides application designers in choosing meaningful sensor subsets, but can also inspire new systems and lead to the evaluation of existing applications.

Today's technology offers a wide variety of sensors. In order to use all the data from this diversity of sensors, a framework of integration is needed. Sensor fusion, fuzzy logic, and neural networks are often mentioned when dealing with the problem of combining information from several sensors to get a more general picture of a given situation. The study of data fusion has been receiving considerable attention (Esteban et al., 2005; Luo & Kay, 1990). A survey of the state of the art in sensor fusion for robotics can be found in (Hackett & Shah, 1990). Henderson and Shilcrat (Henderson & Shilcrat, 1984) introduced the concept of the logic sensor, which defines an abstract specification of the sensors to integrate in a multisensor system. The recent development of micro electro mechanical sensors (MEMS) with wireless communication capabilities enables sensor networks with interesting capabilities. This technology has been applied in several areas (Arampatzis & Manesis, 2005), including robotics. Cheekiralla and Engels (Cheekiralla & Engels, 2005) propose a classification of wireless sensor networks according to their functionalities and properties.

This paper presents the development of a sensor classification scheme based on the frequency spectrum of the signals and on statistical metrics. Bearing these ideas in mind, the paper is organized as follows. Section 2 briefly describes the robotic system enhanced with the instrumentation setup. Section 3 presents the experimental results. Finally, section 4 draws the main conclusions and points out future work.
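Two statistical metrics of the kind such a classification scheme relies on can be sketched in a few lines: the Pearson correlation between two sensor signals (high correlation flags redundant sensors) and the Shannon entropy of a signal's amplitude histogram. The accelerometer traces below are invented; the second is a scaled copy of the first precisely to illustrate the redundancy argument.

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length sensor signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def entropy(signal, bins=8):
    """Shannon entropy (bits) of a histogram of the signal's amplitudes."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0       # constant signal: single bin
    counts = [0] * bins
    for x in signal:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(signal)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# Invented accelerometer traces; s2 is a scaled copy of s1, so the two
# sensors carry the same information (correlation 1 -> redundant).
s1 = [0.0, 1.0, 0.5, -0.3, 0.8, -1.0, 0.2, 0.6]
s2 = [2 * x for x in s1]
print(round(correlation(s1, s2), 6))  # → 1.0
print(round(entropy(s1), 3))          # → 2.25
```

In a scheme like the one described, sensors whose pairwise metrics indicate near-duplicate information are candidates for removal, reducing the cost of the acquisition system.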

Relevance: 20.00%

Abstract:

This chapter analyzes the signals captured during impacts and vibrations of a mechanical manipulator. Eighteen signals are captured and several metrics are calculated between them, such as the correlation, the mutual information and the entropy. A sensor classification scheme based on the multidimensional scaling technique is presented.

Relevance: 20.00%

Abstract:

OBJECTIVE To validate a screening instrument using self-reported assessment of frailty syndrome in older adults.METHODS This cross-sectional study used data from the Saúde, Bem-estar e Envelhecimento study conducted in Sao Paulo, SP, Southeastern Brazil. The sample consisted of 433 older adult individuals (≥ 75 years) assessed in 2009. The self-reported instrument can be applied to older adults or their proxy respondents and consists of dichotomous questions directly related to each component of the frailty phenotype, which is considered the gold standard model: unintentional weight loss, fatigue, low physical activity, decreased physical strength, and decreased walking speed. The same classification proposed in the phenotype was utilized: not frail (no component identified); pre-frail (presence of one or two components), and frail (presence of three or more components). Because this is a screening instrument, “process of frailty” was included as a category (pre-frail and frail). Cronbach’s α was used in psychometric analysis to evaluate the reliability and validity of the criterion, the sensitivity, the specificity, as well as positive and negative predictive values. Factor analysis was used to assess the suitability of the proposed number of components.RESULTS Decreased walking speed and decreased physical strength showed good internal consistency (α = 0.77 and 0.72, respectively); however, low physical activity was less satisfactory (α = 0.63). The sensitivity and specificity for identifying pre-frail individuals were 89.7% and 24.3%, respectively, while those for identifying frail individuals were 63.2% and 71.6%, respectively. In addition, 89.7% of the individuals from both the evaluations were identified in the “process of frailty” category.CONCLUSIONS The self-reported assessment of frailty can identify the syndrome among older adults and can be used as a screening tool. 
Its advantages include simplicity, rapidity, low cost, and the ability to be used by different professionals.
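The phenotype counting rule described above reduces to a very small classifier: each of the five self-reported components is a yes/no answer, and the number of positive components gives the category. The dictionary-based interface below is an assumption made for the sketch, not the instrument's actual scoring form.

```python
# The five components of the frailty phenotype used as the gold standard.
COMPONENTS = ["unintentional weight loss", "fatigue", "low physical activity",
              "decreased physical strength", "decreased walking speed"]

def classify_frailty(answers):
    """answers maps each component to True (present) or False (absent).
    0 components -> not frail; 1-2 -> pre-frail; 3+ -> frail."""
    positives = sum(bool(answers.get(c, False)) for c in COMPONENTS)
    if positives == 0:
        return "not frail"
    if positives <= 2:
        return "pre-frail"
    return "frail"

print(classify_frailty({"fatigue": True, "low physical activity": True}))
# → pre-frail; any pre-frail or frail result falls into the screening
# category "process of frailty".
```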