985 results for Coefficient diagram method
Abstract:
Lexical Resources are a critical component of Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources and a broader range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for the automatic merging of resources. This method includes both the automatic mapping of the resources involved to a common format and their merging once in that format. This paper presents how we have addressed the merging of two verb subcategorization frame lexica for Spanish, but our method will be extended to cover other types of Lexical Resources. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
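As a minimal illustration of the merging step described above (not the authors' actual format), two subcategorization lexica can be represented, once mapped to a common format, as dicts from lemma to a set of frame labels and merged by set union; the lemmas and frame labels below are invented:

```python
def merge_lexica(lex_a, lex_b):
    """Merge two lexica given as dicts of lemma -> set of frame labels."""
    merged = {}
    for lemma in set(lex_a) | set(lex_b):
        # union of frame sets; a lemma missing from one lexicon contributes nothing
        merged[lemma] = lex_a.get(lemma, set()) | lex_b.get(lemma, set())
    return merged

# Invented Spanish verb entries with hypothetical frame labels
lex_a = {"dar": {"NP_NP", "NP_PP(a)"}, "ver": {"NP"}}
lex_b = {"dar": {"NP_NP"}, "comer": {"NP", "intrans"}}
merged = merge_lexica(lex_a, lex_b)
```

In practice the hard part the paper addresses is the mapping to a common format; once formats agree, the merge itself reduces to set operations like the one above.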
Abstract:
OBJECTIVE: Evaluation of a French translation of the Addiction Severity Index (ASI) in 100 (78 male) alcoholic patients. METHOD: The validity of the instrument was assessed by measuring test-retest and interrater reliability, internal consistency, and convergence and discrimination between items and scales. Concurrent validity was assessed by comparing the scores from the ASI with those obtained from three other clinimetric instruments. RESULTS: Test-retest reliability of ASI scores (after a 10-day interval) was good (r = 0.63 to r = 0.95). Interrater reliability was evaluated using six video recordings of patient interviews. Severity ratings assigned by six raters were significantly different (p < .05), but 72% of the ratings assigned by those who viewed the videos were within two points of the interviewer's severity ratings. The Cronbach alpha coefficient of internal consistency varied from 0.58 to 0.81 across scales. The average item-to-scale convergent validity (r value) was 0.49 (range 0.00 to 0.84) for composite scores and 0.35 (range 0.00 to 0.68) for severity ratings, whereas discriminant validity was 0.11 on average (range -0.19 to 0.46) for composite scores and 0.12 (range -0.20 to 0.52) for severity ratings. Finally, concurrent validity with the following instruments was assessed: Severity of Alcohol Dependence Questionnaire (40% shared variance with the ASI alcohol scale), Michigan Alcoholism Screening Test (2% shared variance with the ASI alcohol scale) and Hamilton Depression Rating Scale (31% shared variance with the ASI psychiatric scale). CONCLUSIONS: The Addiction Severity Index covers a large scope of problems encountered among alcoholics and quantifies the need for treatment. This French version shows acceptable reliability and validity.
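Cronbach's alpha, the internal-consistency statistic reported above, is computed from item scores as k/(k-1) * (1 - sum of item variances / variance of respondent totals). A minimal Python sketch with invented scores (not data from the study):

```python
from statistics import variance  # sample variance

def cronbach_alpha(items):
    """Cronbach's alpha. items: list of per-item score lists,
    each covering the same respondents in the same order."""
    k = len(items)
    # total score per respondent (transpose items -> respondents)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# 3 invented items scored by 4 respondents
items = [[1, 2, 3, 4], [2, 2, 4, 4], [1, 3, 3, 5]]
alpha = cronbach_alpha(items)  # about 0.93 for this toy data
```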
Abstract:
PURPOSE: Aerodynamic drag plays an important role in performance for athletes practicing sports that involve high-velocity motions. In giant slalom, the skier is continuously changing his/her body posture, and this affects the energy dissipated in aerodynamic drag. It is therefore important to quantify this energy to understand the dynamic behavior of the skier. The aims of this study were to model the aerodynamic drag of alpine skiers in simulated giant slalom conditions and to apply these models in a field experiment to estimate the energy dissipated through aerodynamic drag. METHODS: The aerodynamic characteristics of 15 recreational male and female skiers were measured in a wind tunnel while they held nine different skiing-specific postures. The drag and the frontal area were recorded simultaneously for each posture. Four generalized and two individualized models of the drag coefficient were built using different sets of parameters. These models were subsequently applied in a field study designed to compare the aerodynamic energy losses between a dynamic and a compact skiing technique. RESULTS: The generalized models estimated aerodynamic drag with an accuracy of between 11.00% and 14.28%, and the individualized models with an accuracy of between 4.52% and 5.30%. The individualized model used for the field study showed that using a dynamic technique led to 10% more aerodynamic drag energy loss than using a compact technique. DISCUSSION: The individualized models were capable of discriminating between different techniques performed by advanced skiers and seemed more accurate than the generalized models. The models presented here offer a simple yet accurate method to estimate the aerodynamic drag acting upon alpine skiers while rapidly moving through the range of positions typical of turning technique.
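The energy dissipated through aerodynamic drag follows from the drag equation: instantaneous drag power is ½ρ(CdA)v³, summed over the run. A minimal sketch, with an assumed constant air density and invented sample values (not the study's model or data):

```python
RHO_AIR = 1.2  # kg/m^3, assumed constant air density

def drag_energy(cd_a_series, v_series, dt):
    """Energy [J] dissipated by aerodynamic drag over a sampled run.
    cd_a_series: drag area Cd*A [m^2] per time step (posture dependent)
    v_series:    skier speed [m/s] per time step
    dt:          sampling interval [s]
    Drag force is 0.5*rho*CdA*v^2, so drag power is 0.5*rho*CdA*v^3.
    """
    return sum(0.5 * RHO_AIR * cda * v ** 3 * dt
               for cda, v in zip(cd_a_series, v_series))

# Two invented samples: compact posture at 20 m/s, upright at 15 m/s
energy = drag_energy([0.3, 0.5], [20.0, 15.0], 0.1)
```

A posture-dependent CdA series like the one above is what the paper's models estimate, from either generalized or individualized parameters.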
Abstract:
AIM: To prospectively study the intraocular pressure (IOP) lowering effect and safety of the new method of very deep sclerectomy with collagen implant (VDSCI) compared with standard deep sclerectomy with collagen implant (DSCI). METHODS: The trial involved 50 eyes of 48 patients with medically uncontrolled primary and secondary open-angle glaucoma, randomized to undergo either the VDSCI procedure (25 eyes) or the DSCI procedure (25 eyes). Follow-up examinations were performed before surgery and after surgery at day 1, week 1, and months 1, 2, 3, 6, 9, 12, 18, and 24. Ultrasound biomicroscopy was performed at 3 and 12 months. RESULTS: The mean follow-up period was 18.6+/-5.9 (VDSCI) and 18.9+/-3.6 (DSCI) months (P=NS). Mean preoperative IOP was 22.4+/-7.4 mm Hg for VDSCI and 20.4+/-4.4 mm Hg for DSCI eyes (P=NS). Mean postoperative IOP was 3.9+/-2.3 (VDSCI) and 6.3+/-4.3 (DSCI) mm Hg (P<0.05) at day 1, and 12.2+/-3.9 (VDSCI) and 13.3+/-3.4 (DSCI) mm Hg (P=NS) at month 24. At the last visit, the complete success rate (defined as an IOP of < or =18 mm Hg and a drop of at least 20%, achieved without medication) was 57% in VDSCI and 62% in DSCI eyes (P=NS). Ultrasound biomicroscopy at 12 months showed a mean volume of the subconjunctival filtering bleb of 3.9+/-4.2 mm3 (VDSCI) and 6.8+/-7.5 mm3 (DSCI) (P=0.426), and of 5.2+/-3.6 mm3 (VDSCI) and 5.4+/-2.9 mm3 (DSCI) (P=0.902) for the intrascleral space. CONCLUSIONS: Very deep sclerectomy seems to provide stable and good control of IOP at 2 years of follow-up, with few postoperative complications, similar to standard deep sclerectomy with collagen implant.
Abstract:
BACKGROUND: Radiation dose exposure is of particular concern in children due to the possible harmful effects of ionizing radiation. The adaptive statistical iterative reconstruction (ASIR) method is a promising new technique that reduces image noise and produces better overall image quality compared with routine-dose contrast-enhanced methods. OBJECTIVE: To assess the benefits of ASIR on the diagnostic image quality in paediatric cardiac CT examinations. MATERIALS AND METHODS: Four paediatric radiologists based at two major hospitals evaluated ten low-dose paediatric cardiac examinations (80 kVp, CTDI(vol) 4.8-7.9 mGy, DLP 37.1-178.9 mGy·cm). The average age of the cohort studied was 2.6 years (range 1 day to 7 years). Acquisitions were performed on a 64-MDCT scanner. All images were reconstructed at various ASIR percentages (0-100%). For each examination, radiologists scored 19 anatomical structures using the relative visual grading analysis method. To estimate the potential for dose reduction, acquisitions were also performed on a Catphan phantom and a paediatric phantom. RESULTS: The best image quality for all clinical images was obtained with 20% and 40% ASIR (p < 0.001) whereas with ASIR above 50%, image quality significantly decreased (p < 0.001). With 100% ASIR, a strong noise-free appearance of the structures reduced image conspicuity. A potential for dose reduction of about 36% is predicted for a 2- to 3-year-old child when using 40% ASIR rather than the standard filtered back-projection method. CONCLUSION: Reconstruction including 20% to 40% ASIR slightly improved the conspicuity of various paediatric cardiac structures in newborns and children with respect to conventional reconstruction (filtered back-projection) alone.
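ASIR percentages are commonly described as a linear blend between the filtered back-projection image and the fully iterative reconstruction; a simplified per-pixel sketch of that blend (the linear-blend model is an assumption here, not taken from the paper):

```python
def blend_asir(fbp_pixel, ir_pixel, percent):
    """Blend a filtered back-projection pixel value with the corresponding
    fully iterative reconstruction value at the given ASIR percentage.
    percent=0 -> pure FBP; percent=100 -> pure iterative reconstruction."""
    w = percent / 100.0
    return (1 - w) * fbp_pixel + w * ir_pixel

# Invented Hounsfield-unit values: 40% ASIR sits between the two inputs
value = blend_asir(100.0, 60.0, 40)
```

Under this reading, the study's finding that 20-40% ASIR is optimal corresponds to a mostly-FBP mix, while 100% ASIR (pure iterative reconstruction) gave the over-smoothed, noise-free appearance described above.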
Abstract:
Objective: to verify the associations between stress, coping and presenteeism in nurses providing direct care to critical and potentially critical patients. Method: a descriptive, cross-sectional, quantitative study conducted between March and April 2010 with 129 hospital nurses. The Inventory of Stress in Nurses, the Occupational Coping Questionnaire and the Work Limitations Scale were used. For the analysis, the Kolmogorov-Smirnov test, Pearson and Spearman correlation coefficients, the chi-square test and the t-test were applied. Results: 66.7% of the nurses showed low stress, 87.6% used control strategies to cope with stress, and 4.84% had a decrease in productivity. Direct and significant relationships between stress and lost productivity were found. Conclusion: stress interferes with the daily life of nurses and impacts productivity. Although associations could not be tested, the control strategy can minimize stress, which in turn contributes to better productivity of nurses in the care of critical and potentially critical patients.
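The Pearson correlation coefficient used in the analysis can be computed directly from its definition: covariance divided by the product of the standard deviations. A minimal pure-Python sketch with invented values:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented values: perfectly increasing pair gives r = 1
r = pearson_r([1, 2, 3], [2, 4, 6])
```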
The Thinking Process of the Theory of Constraints Applied in a Higher Education Institution
Abstract:
The Theory of Constraints (TOC) or Restriction Theory has proved to be a valuable management tool, assisting in the identification of constraints that restrict the ability of companies in the pursuit of their goals.
Therefore, for the nonphysical constraints, TOC developed the Thinking Process (TP), which consists of logical analysis tools that provide a basis for diagnosing problems and formulating solutions and action plans to implement them. This study thus aims to determine the suitability of the Thinking Process of the Theory of Constraints in a Higher Education Institution (HEI) as a way to diagnose problems and propose solutions that enable the institution to continually improve its performance. We will evaluate the tools that make up the TP, which are: Current Reality Tree, Evaporating Cloud Diagram, Future Reality Tree, Prerequisite Tree and Transition Tree. The work starts with a literature review, followed by a field study, and finishes with an application of the Thinking Process in a higher education institution located on the island of São Vicente. The data for the analysis were collected through a closed questionnaire applied to students and staff, an open questionnaire applied to teachers, and structured interviews with the management, all drafted in line with the objectives this study aims to reach. The results of the application of the method presented here allowed us to reach the conclusions drawn in the final chapter of this work.
Abstract:
OBJECTIVE: The aim of this study was to present the process of construction and validation of an instrument for evaluating the care provided to people with wounds, to be used with undergraduate nursing students. METHOD: Methodological study with a quantitative approach, using the Delphi technique in two rounds, the first with 30 judges and the second with 18. The analysis used a Kappa coefficient ≥0.80 and a content validity index >0.80 as cut-offs, with the Wilcoxon test for comparison of the indices between the rounds. RESULTS: Of the 20 categories of the instrument, 18 presented better scores in the second Delphi round. Scores were greater in the second round in seven of the ten evaluation categories. CONCLUSION: Based on the evaluation by the judges, a version of the instrument was defined with adequate indices of agreement and validity, which will be able to help in evaluating the care of people with cutaneous injuries given by undergraduate nursing students.
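The content validity index used as a cut-off above can be illustrated at the item level as the share of judges rating an item relevant (conventionally a 3 or 4 on a 1-4 relevance scale; the ratings below are invented, not the study's data):

```python
def item_cvi(ratings, relevant=(3, 4)):
    """Item-level content validity index: proportion of judges who
    rated the item as relevant on a 1-4 scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Five invented judge ratings for one item: four of five call it relevant
cvi = item_cvi([4, 4, 3, 2, 4])
```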
Abstract:
The trabecular bone score (TBS) is a gray-level textural metric that can be extracted from the two-dimensional lumbar spine dual-energy X-ray absorptiometry (DXA) image. TBS is related to bone microarchitecture and provides skeletal information that is not captured from the standard bone mineral density (BMD) measurement. Based on experimental variograms of the projected DXA image, TBS has the potential to discern differences between DXA scans that show similar BMD measurements. An elevated TBS value correlates with better skeletal microstructure; a low TBS value correlates with weaker skeletal microstructure. Lumbar spine TBS has been evaluated in cross-sectional and longitudinal studies. The following conclusions are based upon publications reviewed in this article: 1) TBS gives lower values in postmenopausal women and in men with previous fragility fractures than their nonfractured counterparts; 2) TBS is complementary to data available by lumbar spine DXA measurements; 3) TBS results are lower in women who have sustained a fragility fracture but in whom DXA does not indicate osteoporosis or even osteopenia; 4) TBS predicts fracture risk as well as lumbar spine BMD measurements in postmenopausal women; 5) efficacious therapies for osteoporosis differ in the extent to which they influence the TBS; 6) TBS is associated with fracture risk in individuals with conditions related to reduced bone mass or bone quality. Based on these data, lumbar spine TBS holds promise as an emerging technology that could well become a valuable clinical tool in the diagnosis of osteoporosis and in fracture risk assessment.
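The experimental variogram underlying TBS is the mean squared gray-level difference between pixels as a function of their distance. A simplified 1-D sketch (the actual TBS computation uses the 2-D DXA image and a proprietary log-log-slope step, neither of which is reproduced here):

```python
def experimental_variogram(pixels, max_lag):
    """Experimental variogram of a 1-D gray-level profile:
    V(h) = mean squared gray-level difference at pixel distance h."""
    out = {}
    for h in range(1, max_lag + 1):
        diffs = [(pixels[i + h] - pixels[i]) ** 2
                 for i in range(len(pixels) - h)]
        out[h] = sum(diffs) / len(diffs)
    return out

# Invented alternating profile: large variation at lag 1, none at lag 2
vario = experimental_variogram([0, 1, 0, 1], 2)
```

Intuitively, a texture with many small, well-separated gray-level variations (stronger microarchitecture) and one with few large variations can share the same mean BMD yet produce different variograms, which is why TBS can discriminate scans with similar BMD.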
Abstract:
TEIXEIRA, José João Lopes. Department of Agricultural Engineering, Centro de Ciências Agrárias, Universidade Federal do Ceará, August 2011. Hydrosedimentology and water availability of the catchment of the Poilão Dam, Cape Verde. Advisor: José Carlos de Araújo. Examiners: George Leite Mamede, Pedro Henrique Augusto Medeiros. The Cape Verde archipelago, off the west coast of Africa, is influenced by the Sahara desert, which gives it a climate characterized by very low rainfall distributed irregularly in space and time. Rainfall is highly concentrated, generating large runoff flows to the sea. Increasing water availability requires, besides the construction and maintenance of infrastructure for capturing and conserving rainwater, efficient management of these resources. The capture, storage and mobilization of surface water through dam construction is currently one of the strategic axes of Cape Verde's state policy. Studies of the hydrological and sedimentological behavior of the reservoir and of its contributing catchment are basic premises for the optimal design, management and monitoring of this infrastructure. In this sense, the present study aimed to systematize hydrological and sedimentological information on the catchment of the Poilão Dam (BP) and to present a long-term operational proposal. The study area occupies 28 km² upstream in the Ribeira Seca catchment (BHRS) on Santiago Island. The altitude of the basin ranges from 99 m, at the level of the dam, to 1,394 m. For the study, a rainfall series from 1973 to 2010, instantaneous discharge records from 1984 to 2000 and agroclimatic records of the study area (1981 to 2004) were used and systematized. The rating-curve method was used to fill gaps in both the runoff and the suspended sediment discharge records.
Sediment yield in the catchment was estimated with the Universal Soil Loss Equation (USLE) and the sediment delivery ratio (SDR). The sediment trapping efficiency of the reservoir was estimated by Brune's method, and the sediment distribution by the empirical area-reduction method described by Borland and Miller and revised by Lara. To generate and simulate yield-versus-reliability curves, the VYELAS computational code, developed by Araújo and based on Campos' approach, was used. The reduction of the withdrawal discharge over the period 2006 to 2026 caused by reservoir silting was also evaluated. It was concluded that mean annual precipitation is 323 mm, with 73% concentrated in August and September; the contributing catchment has a curve number (CN) of 76, with an initial abstraction (Ia) of 26 mm, a runoff coefficient of 19% and an annual inflow of 1.7 hm³ (cv = 0.73); water availability at 85% reliability is estimated at 0.548 hm³/year, not 0.671 hm³/year as indicated in the original design. With a sediment discharge estimated at 22,185 m³/year, it is concluded that by 2026 the reservoir capacity will decrease at a rate of 1.8% per year due to silting, causing a 41% reduction in the initial water availability. By then, losses through evaporation and spillage will be of the order of 81% of the reservoir inflow. Based on these results, an operational proposal for the BP was presented.
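The curve-number runoff parameters reported above (CN = 76, Ia = 26 mm) can be sketched with the standard SCS curve-number equations; the metric formula for the potential retention S is a textbook assumption here, not taken from the thesis:

```python
def scs_runoff(p_mm, cn=76, ia_mm=26.0):
    """Direct runoff depth [mm] from the SCS curve-number method.
    Defaults use the CN and Ia reported for the Poilão catchment.
    S comes from the standard metric formula S = 25400/CN - 254."""
    s = 25400.0 / cn - 254.0          # potential maximum retention [mm]
    if p_mm <= ia_mm:
        return 0.0                    # all rainfall absorbed as initial abstraction
    return (p_mm - ia_mm) ** 2 / (p_mm - ia_mm + s)

# A 100 mm storm event produces roughly 35 mm of direct runoff
q = scs_runoff(100.0)
```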
Abstract:
We present a new method for constructing exact distribution-free tests (and confidence intervals) for variables that can generate more than two possible outcomes. This method separates the search for an exact test from the goal of creating a non-randomized test. Randomization is used to extend any exact test relating to means of variables with finitely many outcomes to variables with outcomes belonging to a given bounded set. Tests in terms of variance and covariance are reduced to tests relating to means. Randomness is then eliminated in a separate step. This method is used to create confidence intervals for the difference between two means (or variances) and tests of stochastic inequality and correlation.
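A generic exact permutation test for the difference of two means, a simpler relative of the construction described above (not the paper's method), enumerates all label reassignments and counts those at least as extreme as the observed difference:

```python
from itertools import combinations

def exact_perm_test(x, y):
    """Exact two-sided permutation test for the difference of means.
    p-value = share of ways to relabel the pooled data whose mean
    difference is at least as large in absolute value as observed."""
    pooled = x + y
    observed = sum(x) / len(x) - sum(y) / len(y)
    count = total = 0
    for idx in combinations(range(len(pooled)), len(x)):
        xs = [pooled[i] for i in idx]
        ys = [pooled[i] for i in range(len(pooled)) if i not in idx]
        diff = sum(xs) / len(xs) - sum(ys) / len(ys)
        total += 1
        if abs(diff) >= abs(observed) - 1e-12:   # tolerance for float ties
            count += 1
    return count / total

# Invented samples: with only 4 observations the smallest attainable p is 2/6
p = exact_perm_test([1, 2], [3, 4])
```

Like the paper's tests, this is exact and distribution-free; unlike the paper's construction, it does not extend to arbitrary bounded outcome sets via randomization.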
Abstract:
Models incorporating more realistic representations of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program, called the CDLP, which has an exponential number of columns. However, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper we propose a new approach, called SDCP, to solving the CDLP based on segments and their consideration sets. SDCP is a relaxation of CDLP and hence forms a looser upper bound on the dynamic program, but it coincides with CDLP in the case of non-overlapping segments. If the number of elements in a segment's consideration set is not very large, SDCP can be applied to any discrete-choice model of consumer behavior. We tighten the SDCP bound (i) by simulations, called the randomized concave programming (RCP) method, and (ii) by adding cuts to a recent compact formulation of the problem for a latent multinomial-choice model of demand (SBLP+). This latter approach turns out to be very effective, essentially obtaining the CDLP value and excellent revenue performance in simulations, even for overlapping segments. By formulating the problem as a separation problem, we give insight into why CDLP is easy for the MNL with non-overlapping consideration sets and why generalizations of MNL pose difficulties. We perform numerical simulations to determine the revenue performance of all the methods on reference data sets in the literature.
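Under the multinomial logit (MNL) model mentioned above, purchase probabilities for an offer set take a closed form: P(j|S) = exp(u_j) / (1 + sum over k in S of exp(u_k)), where the 1 in the denominator is the no-purchase alternative with utility normalized to 0. A minimal sketch with invented product utilities (illustrative only, not the paper's formulation):

```python
from math import exp

def mnl_choice_probs(utilities, offered):
    """MNL purchase probabilities over an offer set.
    utilities: dict product -> deterministic utility
    offered:   iterable of products in the offer set S
    The no-purchase option contributes exp(0) = 1 to the denominator."""
    denom = 1.0 + sum(exp(utilities[j]) for j in offered)
    return {j: exp(utilities[j]) / denom for j in offered}

# Two invented products with equal utility: each bought with prob 1/3,
# the remaining 1/3 being the no-purchase outcome
probs = mnl_choice_probs({"a": 0.0, "b": 0.0}, ["a", "b"])
```

It is this closed form that keeps column generation tractable for the MNL with non-overlapping consideration sets; richer choice models lose it, which is one source of the difficulties the paper analyzes.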