866 results for sequential-tests
Abstract:
The purpose of the present study was to explore the usefulness of the Mexican Sequential Organ Failure Assessment (MEXSOFA) score for assessing the risk of mortality for critically ill patients in the ICU. A total of 232 consecutive patients admitted to an ICU were included in the study. The MEXSOFA was calculated using the original SOFA scoring system with two modifications: the PaO2/FiO2 ratio was replaced with the SpO2/FiO2 ratio, and the evaluation of neurologic dysfunction was excluded. The ICU mortality rate was 20.2%. Patients with an initial MEXSOFA score of 9 points or less, calculated during the first 24 h after admission to the ICU, had a mortality rate of 14.8%, while those with an initial MEXSOFA score of 10 points or more had a mortality rate of 40%. The MEXSOFA score at 48 h was also associated with mortality: patients with a score of 9 points or less had a mortality rate of 14.1%, while those with a score of 10 points or more had a mortality rate of 50%. In a multivariate analysis, only the MEXSOFA score at 48 h was an independent predictor of in-ICU death, with an OR = 1.35 (95% CI = 1.14-1.59, P < 0.001). The SOFA and MEXSOFA scores calculated 24 h after admission to the ICU demonstrated a good level of discrimination for predicting the in-ICU mortality risk in critically ill patients. The MEXSOFA score at 48 h was an independent predictor of death; with each 1-point increase, the odds of death increased by 35%.
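As a worked illustration of the last sentence (purely arithmetic, not part of the study), an odds ratio of 1.35 per point means the odds of ICU death multiply by 1.35 for each additional point, so a k-point increase multiplies them by 1.35^k; a minimal Python sketch:

    # Odds ratio per 1-point increase in the 48-h MEXSOFA score (from the abstract)
    OR_PER_POINT = 1.35

    def odds_multiplier(points, or_per_point=OR_PER_POINT):
        # Log-odds are additive, so a k-point increase multiplies the odds by OR**k.
        return or_per_point ** points

    print(odds_multiplier(1))                # 1.35 -> odds rise by 35% per point
    print(round(odds_multiplier(4), 2))      # ~3.32 for a hypothetical 4-point increase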
Abstract:
Although radical nephrectomy alone is widely accepted as the standard of care for localized renal cell carcinoma (RCC), it is not sufficient for the treatment of metastatic RCC (mRCC), which invariably leads to an unfavorable outcome despite the use of multiple therapies. Currently, sequential targeted agents are recommended for the management of mRCC, but the optimal drug sequence is still debated. This case concerned a 57-year-old man with clear-cell mRCC who received multiple therapies following his first operation in 2003 and has survived for over 10 years with a satisfactory quality of life. The treatments given included several surgeries, immunotherapy, and sequentially administered sorafenib, sunitinib, and everolimus regimens. In the course of mRCC treatment, well-planned surgeries, effective sequential targeted therapies, and close follow-up are all of great importance for optimal management and a satisfactory outcome.
Abstract:
Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably presented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms on real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
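As a rough illustration of what mesh simplification involves (an illustrative sketch, not an algorithm evaluated in the paper), the snippet below implements naive vertex clustering in Python/NumPy: vertices that fall into the same grid cell are merged into their centroid, and triangles that collapse are discarded. The triangle-mesh arrays and the cell size are assumptions.

    import numpy as np

    def simplify_by_vertex_clustering(vertices, faces, cell_size):
        """Merge all vertices falling into the same axis-aligned grid cell into
        their centroid; drop faces that become degenerate after remapping."""
        vertices = np.asarray(vertices, dtype=float)
        faces = np.asarray(faces, dtype=int)
        cells = np.floor(vertices / cell_size).astype(int)    # grid cell per vertex
        uniq, inverse = np.unique(cells, axis=0, return_inverse=True)
        inverse = inverse.ravel()                             # new index per old vertex
        new_vertices = np.zeros((len(uniq), 3))
        counts = np.zeros(len(uniq))
        np.add.at(new_vertices, inverse, vertices)            # sum members of each cell
        np.add.at(counts, inverse, 1)
        new_vertices /= counts[:, None]                       # centroid of each cell
        remapped = inverse[faces]
        keep = ((remapped[:, 0] != remapped[:, 1]) &
                (remapped[:, 1] != remapped[:, 2]) &
                (remapped[:, 0] != remapped[:, 2]))
        return new_vertices, remapped[keep]

Coarser cell sizes give stronger simplification at the cost of geometric fidelity; production algorithms (edge collapse with quadric error metrics, for example) make this trade-off adaptively.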
Abstract:
Significant initiatives exist within the global food market to search for new, alternative protein sources with better technological, functional, and nutritional properties. Lima bean (Phaseolus lunatus L.) protein isolate was hydrolyzed using a sequential pepsin-pancreatin enzymatic system. Hydrolysis was performed to produce limited (LH) and extensive (EH) hydrolysates, each with a different degree of hydrolysis (DH). The effects of hydrolysis were evaluated in vitro in both hydrolysates based on structural, functional and bioactive properties. Structural properties analyzed by electrophoretic profile indicated that LH retained residual structures very similar to the protein isolate (PI), although it was composed of mixtures of polypeptides with an increased hydrophobic surface and denaturation temperature. The functionality of LH was associated with its amino acid composition and hydrophobic/hydrophilic balance, which increased solubility at values close to the isoelectric point. Foaming and emulsifying activity index values were also higher than those of PI. EH showed a structure composed of mixtures of polypeptides and low-molecular-weight peptides, whose intrinsic hydrophobicity and amino acid profile were associated with antioxidant capacity as well as with inhibition of angiotensin-converting enzyme. The results indicated the potential of Phaseolus lunatus hydrolysates to be incorporated into foods to improve techno-functional properties and impart bioactive properties.
Abstract:
Electric road vehicles were common at the beginning of the 20th century, but internal combustion engines eventually displaced electric motors in road vehicles. Environmental awareness, together with the price and availability of crude oil, is driving the comeback of electric vehicles. Advances in industrial technology and the political climate in the EU, such as the 20-20-20 directive, which aims to reduce fossil emissions, increase renewable energy, and improve energy efficiency, have made electrification popular again. In this thesis, tests based on the ISO 16750-2 standard for electrical loads on electrical equipment in road vehicles are carried out on Visedo Oy's PowerMASTER M-frame power electronics device. The device is designed mainly for drivetrains in mobile work machines and marine vessels, but it can be used in other applications within its power range, which also includes road vehicles. The functionality of the device is verified with preliminary tests, which act as a framework for the standard-based tests.
Abstract:
Narrative therapy is a postmodern therapy that takes the position that people create self-narratives to make sense of their experiences. To date, narrative therapy has compiled virtually no quantitative and very little qualitative research, leaving gaps in almost all areas of process and outcome. White (2006a), one of the therapy's founders, has recently utilized Vygotsky's (1934/1987) theories of the zone of proximal development (ZPD) and concept formation to describe the process of change in narrative therapy with children. In collaboration with the child client, the narrative therapist formalizes therapeutic concepts and submits them to increasing levels of generalization to create a ZPD. This study sought to determine whether the child's development proceeds through the stages of concept formation over the course of a session, and whether therapists' utterances scaffold this movement. A sequential analysis was used due to its unique ability to measure dynamic processes in social interactions. Stages of concept formation and scaffolding were coded over time. A hierarchical log-linear analysis was performed on the sequential data to develop a model of therapist scaffolding and child concept development. This was intended to determine what patterns occur and whether the stated intent of narrative therapy matches its actual process. In accordance with narrative therapy theory, the log-linear analysis produced a final model with interactions between therapist and child utterances, and between both therapist and child utterances and time. Specifically, the child and youth participants in therapy tended to respond to therapist scaffolding at the corresponding level of concept formation. Both children and youth and therapists also tended to move away from earlier and toward later stages of White's scaffolding conversations map as the therapy session advanced. These findings provide support for White's contention that narrative therapists promote child development by scaffolding child concept formation in therapy.
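As a toy illustration of sequential coding analysis (hypothetical codes, and a plain chi-square test standing in for the hierarchical log-linear model used in the study), the sketch below cross-tabulates child concept-formation stages against the immediately preceding therapist scaffolding stages:

    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical coded exchange: each child utterance follows one therapist
    # utterance; codes 1-4 are stages of scaffolding / concept formation.
    therapist_codes = [1, 1, 2, 2, 3, 3, 3, 4, 4, 2]
    child_codes     = [1, 2, 2, 3, 3, 4, 3, 4, 4, 3]

    n_stages = 4
    table = np.zeros((n_stages, n_stages), dtype=int)
    for t, c in zip(therapist_codes, child_codes):
        table[t - 1, c - 1] += 1          # lag-1 transition counts

    # Test whether child stages are associated with the preceding therapist stage
    chi2, p, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")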
Abstract:
Flow injection analysis (FIA) was applied to the determination of both chloride ion and mercury in water. Conventional FIA was employed for the chloride study. Investigations of the Fe3+/Hg(SCN)2/Cl-, 450 nm spectrophotometric system for chloride determination led to the discovery of an absorbance in the 250-260 nm region when Hg(SCN)2 and Cl- are combined in solution, in the absence of iron(III). Employing an in-house FIA system, absorbance observed at 254 nm exhibited a linear relation from essentially 0-2000 µg ml-1 injected chloride. This linear range spanning three orders of magnitude is superior to the Fe3+/Hg(SCN)2/Cl- system currently employed by laboratories worldwide. The detection limit obtainable with the proposed method was determined to be 0.16 µg ml-1 and the relative standard deviation was determined to be 3.5% over the concentration range of 0-200 µg ml-1. Other halogen ions were found to interfere with chloride determination at 254 nm whereas cations did not interfere. This system was successfully applied to the determination of chloride ion in laboratory water. Sequential injection (SI)-FIA was employed for mercury determination in water with the PSA Galahad mercury amalgamation and Merlin mercury fluorescence detection systems. Initial mercury-in-air determinations involved injections of mercury-saturated air directly into the Galahad, whereas mercury-in-water determinations involved solution delivery via peristaltic pump to a gas/liquid separator, after reduction by stannous chloride. A series of changes were made to the internal hardware and valving systems of the Galahad mercury preconcentrator. Sequential injection solution delivery replaced the continuous peristaltic pump system, and computer control was implemented to control and integrate all aspects of solution delivery, sample preconcentration and signal processing. Detection limits currently obtainable with this system are 0.1 ng ml-1 Hg0.
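A minimal sketch (with made-up numbers, not data from this work) of the routine calculations behind such figures of merit: a least-squares calibration line, a detection limit taken as three times the blank standard deviation divided by the slope (one common convention), and the relative standard deviation of replicate injections:

    import numpy as np

    # Hypothetical calibration: injected chloride (ug/ml) vs. absorbance at 254 nm
    conc = np.array([0, 50, 100, 250, 500, 1000, 2000], dtype=float)
    absorbance = np.array([0.002, 0.026, 0.051, 0.124, 0.251, 0.498, 1.003])
    slope, intercept = np.polyfit(conc, absorbance, 1)   # A = slope*C + intercept

    # Detection limit as 3 * s(blank) / slope (one common convention)
    blanks = np.array([0.0021, 0.0018, 0.0024, 0.0019, 0.0022])
    lod = 3 * blanks.std(ddof=1) / slope

    # Relative standard deviation of replicate injections at one level
    reps = np.array([0.051, 0.049, 0.052, 0.050])
    rsd = 100 * reps.std(ddof=1) / reps.mean()

    print(f"slope = {slope:.5f} AU per ug/ml, LOD ~ {lod:.2f} ug/ml, RSD = {rsd:.1f}%")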
Abstract:
This study sought to explore the current state of Grades 4 to 8 science education in Ontario from the perspective of Junior/Intermediate (J/I) teachers. The study employed a sequential two-phase explanatory mixed methods design, denoted QUAN (qual) qual. Data were collected from an online survey and follow-up interviews. J/I teachers (N = 219) from 48 school boards in Ontario completed a survey that collected both quantitative and qualitative data. Interviewees were selected from the survey participant population (n = 6) to represent a range of teaching strategies, attitudes toward teaching science, and years of experience. Survey and interview questions inquired about teacher attitudes toward teaching science, academic and professional experiences, teaching strategies, support resources, and instructional time allotments. Quantitative data analyses involved descriptive statistics and chi-square tests. Qualitative data were coded inductively and deductively. Academic background in science was found to significantly influence teachers' reported capability to teach science; in particular, the undergraduate degrees held by J/I science teachers significantly influenced their reported levels of capability to teach science. Participants identified a lack of time allocated for science instruction and inadequate equipment and facilities as major limitations on science instruction. Science in schools was reported to be of “second-tiered” value relative to language and mathematics. Implications of this study include improving the undergraduate and preservice experiences of elementary teachers by supporting their science content knowledge and pedagogical content knowledge.
Abstract:
The conclusion of the article states "it appears that previously learned choices may affect future choices in Y-mazes for cattle. Another area that needs to be researched is the effects of a mildly aversive treatment versus a severely aversive treatment on the tendency of a bovine to resist changing a learned choice".
Abstract:
In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied on all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the latter statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
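The Monte Carlo test technique referred to here can be sketched generically (a toy example under assumed Gaussian errors, not the paper's MLR setting): simulate the pivotal statistic under the null a fixed number of times and locate the observed value among the simulated ones.

    import numpy as np

    def monte_carlo_pvalue(observed_stat, simulate_null_stat, n_rep=99, seed=None):
        """Monte Carlo test p-value: with a pivotal statistic, rejecting when this
        p-value is <= alpha gives a test of exact level alpha whenever
        (n_rep + 1) * alpha is an integer."""
        rng = np.random.default_rng(seed)
        sims = np.array([simulate_null_stat(rng) for _ in range(n_rep)])
        return (1 + np.sum(sims >= observed_stat)) / (n_rep + 1)

    # Toy usage: |sample correlation| as a statistic for independence of two
    # Gaussian series of length n (scale-free, hence pivotal under the null).
    n = 25
    x, y = np.random.default_rng(1).standard_normal((2, n))
    obs = abs(np.corrcoef(x, y)[0, 1])
    p = monte_carlo_pvalue(
        obs, lambda rng: abs(np.corrcoef(*rng.standard_normal((2, n)))[0, 1]))
    print(f"MC p-value: {p:.2f}")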
Abstract:
This paper proposes finite-sample procedures for testing the SURE specification in multi-equation regression models, i.e. whether the disturbances in different equations are contemporaneously uncorrelated or not. We apply the technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] to obtain exact tests based on standard LR and LM zero correlation tests. We also suggest a MC quasi-LR (QLR) test based on feasible generalized least squares (FGLS). We show that the latter statistics are pivotal under the null, which provides the justification for applying MC tests. Furthermore, we extend the exact independence test proposed by Harvey and Phillips (1982) to the multi-equation framework. Specifically, we introduce several induced tests based on a set of simultaneous Harvey/Phillips-type tests and suggest a simulation-based solution to the associated combination problem. The properties of the proposed tests are studied in a Monte Carlo experiment which shows that standard asymptotic tests exhibit important size distortions, while MC tests achieve complete size control and display good power. Moreover, MC-QLR tests performed best in terms of power, a result of interest from the point of view of simulation-based tests. The power of the MC induced tests improves appreciably in comparison to standard Bonferroni tests and, in certain cases, outperforms the likelihood-based MC tests. The tests are applied to data used by Fischer (1993) to analyze the macroeconomic determinants of growth.
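A small sketch of a zero-correlation statistic of the kind discussed (the standard Breusch-Pagan LM statistic, computed here directly on hypothetical disturbances rather than on estimated SURE residuals), with its asymptotic chi-square p-value and a Monte Carlo p-value that exploits the statistic's pivotality under Gaussian errors:

    import numpy as np
    from scipy.stats import chi2

    def bp_lm_stat(u):
        """Breusch-Pagan LM statistic for zero contemporaneous correlation:
        T times the sum of squared pairwise correlations across equations.
        u is a (T, p) array with one column per equation."""
        T = u.shape[0]
        r = np.corrcoef(u, rowvar=False)
        return T * np.sum(np.tril(r, k=-1) ** 2)

    rng = np.random.default_rng(3)
    u = rng.standard_normal((50, 3))      # hypothetical 3-equation system, T = 50
    lm = bp_lm_stat(u)
    dof = 3 * (3 - 1) // 2
    print(f"LM = {lm:.2f}, asymptotic p = {chi2.sf(lm, dof):.3f}")

    # Monte Carlo p-value: simulate the statistic under independent Gaussian errors
    sims = np.array([bp_lm_stat(rng.standard_normal((50, 3))) for _ in range(99)])
    print(f"MC p = {(1 + np.sum(sims >= lm)) / 100:.2f}")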
Abstract:
In this paper, we review some recent developments in econometrics that may be of interest to researchers in fields other than economics, and we underscore the particular light that econometrics can shed on some general themes in methodology and the philosophy of science, such as falsifiability as a criterion of the scientific character of a theory (Popper), the underdetermination of theories by data (Quine), and instrumentalism. In particular, we emphasize the contrast between two styles of modeling, the parsimonious approach and the statistical-descriptive approach, and we discuss the links between the theory of statistical testing and the philosophy of science.
Abstract:
A wide range of tests for heteroskedasticity have been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods. Yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics as well as (when relevant) unidentified-nuisance-parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation. The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
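A toy version of the sup-type idea for a variance break at an unknown point (hypothetical numbers; in a regression one would recompute residuals from simulated errors rather than draw the disturbances directly): a Goldfeld-Quandt-style variance ratio is maximized over candidate breakpoints and referred to its simulated null distribution.

    import numpy as np

    def sup_gq_stat(e, breakpoints):
        """Largest Goldfeld-Quandt-style variance ratio (larger segment variance
        over the smaller) across a set of candidate breakpoints."""
        ratios = []
        for b in breakpoints:
            s1, s2 = np.var(e[:b], ddof=1), np.var(e[b:], ddof=1)
            ratios.append(max(s1, s2) / min(s1, s2))
        return max(ratios)

    def mc_pvalue(observed, n, breakpoints, n_rep=99, seed=None):
        """The statistic is scale-invariant, so its null distribution under i.i.d.
        Gaussian errors is free of nuisance parameters and can be simulated."""
        rng = np.random.default_rng(seed)
        sims = [sup_gq_stat(rng.standard_normal(n), breakpoints) for _ in range(n_rep)]
        return (1 + sum(s >= observed for s in sims)) / (n_rep + 1)

    rng = np.random.default_rng(42)
    e = np.concatenate([rng.normal(0, 1, 30), rng.normal(0, 2, 30)])   # break halfway
    bps = range(15, 46, 5)              # candidate breakpoints kept away from the ends
    obs = sup_gq_stat(e, bps)
    print(f"sup-GQ = {obs:.2f}, MC p-value = {mc_pvalue(obs, len(e), bps):.2f}")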
Abstract:
In this paper, we analyze recent developments in econometrics in light of the theory of statistical testing. We first review some fundamental principles of the philosophy of science and of statistical theory, with emphasis on parsimony and falsifiability as criteria for evaluating models, on the role of testing theory as a formalization of the principle of falsification for probabilistic models, and on the logical justification of the basic notions of testing theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which testing procedures are commonly proposed are in fact not testable at all. Such situations lead to ill-posed statistical problems. We analyze several special cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses, including procedures robust to heteroskedasticity, non-normality, or dynamic specification. We point out that these difficulties often stem from the ambition to weaken the regularity conditions required for any statistical analysis, as well as from an inappropriate use of asymptotic distributional results. Finally, we emphasize the importance of formulating testable hypotheses and models, and of proposing econometric techniques whose properties can be established in finite samples.