950 results for AFT Models for Crash Duration Survival Analysis
Abstract:
This dissertation explores phase I dose-finding designs in cancer trials from three perspectives: alternative Bayesian dose-escalation rules, a design based on a time-to-dose-limiting toxicity (DLT) model, and a design based on a discrete-time multi-state (DTMS) model. We list alternative Bayesian dose-escalation rules and perform a simulation study for the intra-rule and inter-rule comparisons based on two statistical models to identify the most appropriate rule under certain scenarios. We provide evidence that all the Bayesian rules outperform the traditional "3+3" design in the allocation of patients and selection of the maximum tolerated dose. The design based on a time-to-DLT model uses patients' DLT information over multiple treatment cycles in estimating the probability of DLT at the end of treatment cycle 1. Dose-escalation decisions are made whenever a cycle-1 DLT occurs, or two months after the previous checkpoint. Compared to the design based on a logistic regression model, the new design shows more safety benefits for trials in which more late-onset toxicities are expected. As a trade-off, the new design requires more patients on average. The design based on a discrete-time multi-state (DTMS) model has three important attributes: (1) toxicities are categorized over a distribution of severity levels, (2) early toxicity may inform dose escalation, and (3) no suspension is required between accrual cohorts. The proposed model accounts for the difference in the importance of the toxicity severity levels and for transitions between toxicity levels. We compare the operating characteristics of the proposed design with those from a similar design based on a fully-evaluated model that directly models the maximum observed toxicity level within the patients' entire assessment window. We describe settings in which, under comparable power, the proposed design shortens the trial.
The benefit of the proposed design over the alternative design increases as patient accrual becomes slower.
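The "3+3" comparator mentioned above can be made concrete with a short simulation. This is a minimal sketch of one common variant of the rule (escalate on 0/3 or at most 1/6 DLTs, stop on 2 or more), with illustrative true toxicity probabilities; it is not the dissertation's simulation code.

```python
import random

def simulate_3plus3(tox_probs, seed=0):
    """One trial under a common variant of the '3+3' rule.

    tox_probs: assumed true DLT probability at each dose level.
    Returns (selected MTD index, total patients treated); the MTD index
    is -1 when even the lowest dose proves too toxic.
    """
    rng = random.Random(seed)
    dose, n = 0, 0
    while True:
        dlts = sum(rng.random() < tox_probs[dose] for _ in range(3))
        n += 3
        if dlts == 1:                      # 1/3 DLTs: expand the cohort to 6
            dlts += sum(rng.random() < tox_probs[dose] for _ in range(3))
            n += 3
        if dlts >= 2:                      # too toxic: MTD is the dose below
            return dose - 1, n
        if dose == len(tox_probs) - 1:     # tolerated the highest dose
            return dose, n
        dose += 1                          # 0/3 or 1/6 DLTs: escalate
```

Running many such trials over a grid of toxicity scenarios is how designs like these are typically compared on MTD-selection accuracy and patient allocation.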
Abstract:
There is growing support for the theory that an interaction between the immune and reproductive/endocrine systems underlies the pathogenesis of autoimmune rheumatic diseases. Most of the recent evidence derives from studies of sex hormones and pregnancy in women with systemic lupus. Other than an ameliorative effect of pregnancy, little is known about reproductive factors in relation to rheumatoid arthritis. To elucidate the relationship, a population-based retrospective study was undertaken. Included were 378 female residents of Olmsted County, Minnesota diagnosed with rheumatoid arthritis between 1950 and 1982 (cases) and 325 arthritis-free, married female controls matched to the 324 married cases on birth-year, age at first marriage, and duration of Olmsted County residency. Information on reproductive factors was extracted from the medical records system maintained by the Mayo Clinic. Cases had lower fertility rates compared with the female population of Minnesota (rate ratio = 0.86, 95% confidence interval (CI) = 0.80-0.92). Fertility was significantly reduced even prior to the onset of rheumatoid factor positive arthritis. Restricting the comparison to married Olmsted County residents did not alter the results. Further adjustment for time not at risk of conception, using survival analysis and proportional hazards modeling, only intensified the fertility reduction in the married cases compared with controls. Nulligravidity was more common among cases than controls (odds ratio = 3.16, CI = 1.61-6.20). Independent of fertility, pregnancy had a protective effect against rheumatoid arthritis (odds ratio = 0.31, CI = 0.11-0.89), which was dramatically reversed in the 12 months postpartum (odds ratio = 4.67, CI = 1.50-14.47).
Cases were younger at menopause than controls (p < 0.01). Small but statistically nonsignificant associations were observed between rheumatoid arthritis and the following factors: increased frequency of complaints to a physician of infertility; increased frequency of spontaneous abortion, premature birth, and congenital malformations following arthritis onset; and increased prevalence of menopause at arthritis onset. Cases did not differ from controls in age at menarche, duration of pregnancy, or birth weight. The findings provide further support for the involvement of the reproductive/endocrine systems in the pathogenesis of autoimmune rheumatic disease. The search for biological mechanisms should be intensified.
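The odds ratios with confidence intervals quoted above are standard 2x2-table quantities. A minimal sketch of the Woolf (log-scale) interval follows; the cell counts in the usage note are made up for illustration, not taken from this study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio of a 2x2 table with a Woolf (log-scale) 95% CI.

    a, b = exposed/unexposed cases; c, d = exposed/unexposed controls.
    The standard error of log(OR) is sqrt(1/a + 1/b + 1/c + 1/d).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

For example, a hypothetical table with 20 exposed and 80 unexposed cases versus 10 exposed and 90 unexposed controls gives `odds_ratio_ci(20, 80, 10, 90)`, an OR of 2.25 with its 95% interval.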
Abstract:
In several studies of antiretroviral treatment (ART) programs for persons with human immunodeficiency virus infection, investigators have reported a higher rate of loss to follow-up (LTFU) among patients initiating ART in recent years than among patients who initiated ART during earlier time periods. This finding is frequently interpreted as reflecting deterioration of patient retention in the face of increasing patient loads. However, in this paper we demonstrate by simulation that transient gaps in follow-up can lead to bias when standard survival analysis techniques are applied. We created a simulated cohort of patients with different dates of ART initiation. Rates of ART interruption, ART resumption, and mortality were assumed to remain constant over time, but when we applied a standard definition of LTFU, the simulated probability of being classified LTFU at a particular ART duration was substantially higher in recently enrolled cohorts. This suggests that much of the apparent trend towards increased LTFU may be attributed to bias caused by transient interruptions in care. Alternative statistical techniques need to be used when analyzing predictors of LTFU; for example, "prospective" definitions of LTFU can be used in place of "retrospective" definitions. Similar considerations may apply when analyzing predictors of LTFU from treatment programs for other chronic diseases.
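The mechanism the authors describe can be reproduced in a few lines. The sketch below is not their simulation: the monthly interruption and resumption probabilities and the 6-month retrospective window are illustrative assumptions. With behavior identical across calendar time, patients enrolled shortly before database closure are classified LTFU within their first 12 months of ART far more often than early enrollees, because their transient gaps cannot yet be observed to end.

```python
import random

def ltfu_by_12_months(n, start, closure, p_stop=0.05, p_resume=0.15,
                      window=6, seed=1):
    """Fraction of patients classified LTFU within their first 12 months
    of ART under a *retrospective* rule: last visit more than `window`
    months before database closure. All monthly rates are illustrative."""
    rng = random.Random(seed)
    count = 0
    for _ in range(n):
        on_art, last_visit = True, start
        for month in range(start, closure):
            if on_art:
                last_visit = month          # a clinic visit happens this month
                if rng.random() < p_stop:   # transient treatment interruption
                    on_art = False
            elif rng.random() < p_resume:   # gap ends; visits resume next month
                on_art = True
        if last_visit < closure - window and last_visit - start <= 12:
            count += 1
    return count / n

# Identical behavior over calendar time, different enrollment dates:
early = ltfu_by_12_months(3000, start=0, closure=60)
late = ltfu_by_12_months(3000, start=40, closure=60)
```

Despite constant underlying rates, `late` comes out well above `early`, reproducing the spurious "worsening retention" trend.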
Abstract:
Background: Current literature suggests a positive influence of additive classical homeopathy on global health and well-being in cancer patients. Besides encouraging case reports, there is little if any research on long-term survival of patients who obtain homeopathic care during cancer treatment. Design: Data from cancer patients who had undergone homeopathic treatment complementary to conventional anti-cancer treatment at the Outpatient Unit for Homeopathy in Malignant Diseases, Medical University Vienna, Department of Medicine I, Vienna, Austria, were collected, described, and a retrospective subgroup analysis with regard to survival time was performed. Patient inclusion criteria were at least three homeopathic consultations, fatal prognosis of disease, quantitative and qualitative description of patient characteristics, and survival time. Results: In four years, a total of 538 patients were recorded to have visited the Outpatient Unit for Homeopathy in Malignant Diseases, Medical University Vienna, Department of Medicine I, Vienna, Austria. 62.8% of them were women, and nearly 20% had breast cancer. Of the 53.7% (n = 287) who had undergone at least three homeopathic consultations within four years, 18.7% (n = 54) fulfilled the inclusion criteria for survival analysis. The surveyed neoplasms were glioblastoma, lung, cholangiocellular and pancreatic carcinomas, metastasized sarcoma, and renal cell carcinoma. Median overall survival was compared to expert expectations of survival outcomes by specific cancer type and was prolonged across observed cancer entities (p < 0.001). Conclusion: Extended survival time in this sample of cancer patients with fatal prognosis but additive homeopathic treatment is interesting. However, findings are based on a small sample, and with only limited data available about patient and treatment characteristics.
The relationship between homeopathic treatment and survival time requires prospective investigation in larger samples, possibly using matched-pair control analysis or randomized trials.
Abstract:
Quantification of protein expression based on immunohistochemistry (IHC) is an important step in clinical diagnoses and translational tissue-based research. Manual scoring systems are used to evaluate protein expression based on staining intensities and distribution patterns. However, visual scoring remains an inherently subjective approach. The aim of our study was to explore whether digital image analysis proves to be an alternative or even superior tool for quantifying the expression of membrane-bound proteins. We analyzed five membrane-bound biomarkers (HER2, EGFR, pEGFR, β-catenin, and E-cadherin) and performed IHC on tumor tissue microarrays from 153 esophageal adenocarcinoma patients from a single-center study. The tissue cores were scored visually using an established routine scoring system as well as by digital image analysis, which yields a continuous spectrum of average staining intensity. Subsequently, we compared both assessments using survival analysis as an end point. There were no significant correlations with patient survival using visual scoring of β-catenin, E-cadherin, pEGFR, or HER2. In contrast, the digital image analysis approach showed significant associations with disease-free survival for β-catenin, E-cadherin, pEGFR, and HER2 (P = 0.0125, P = 0.0014, P = 0.0299, and P = 0.0096, respectively). For EGFR, there was a greater association with patient survival when digital image analysis was used than with visual scoring (visual: P = 0.0045, image analysis: P < 0.0001). The results of this study indicated that digital image analysis was superior to visual scoring. Digital image analysis is more sensitive and, therefore, better able to detect biological differences within the tissues with greater accuracy. This increased sensitivity improves the quality of quantification.
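Group comparisons of survival curves like those above typically rest on the log-rank test. A self-contained sketch of the two-group statistic (1 degree of freedom) follows; it is a generic textbook implementation, not the authors' analysis code.

```python
def logrank_stat(times_a, events_a, times_b, events_b):
    """Two-group log-rank chi-square statistic (1 degree of freedom).

    events: 1 = event observed, 0 = right-censored.
    Compares observed vs expected events in group A at each event time,
    using the hypergeometric variance at that time.
    """
    data = [(t, e, 0) for t, e in zip(times_a, events_a)]
    data += [(t, e, 1) for t, e in zip(times_b, events_b)]
    obs_a = exp_a = var = 0.0
    for t in sorted({t for t, e, _ in data if e == 1}):
        at_risk = [row for row in data if row[0] >= t]
        n = len(at_risk)
        n_a = sum(1 for _, _, g in at_risk if g == 0)
        d = sum(e for tt, e, _ in at_risk if tt == t)
        d_a = sum(e for tt, e, g in at_risk if tt == t and g == 0)
        obs_a += d_a
        exp_a += d * n_a / n
        if n > 1:
            var += d * (n_a / n) * (1 - n_a / n) * (n - d) / (n - 1)
    return (obs_a - exp_a) ** 2 / var
```

A statistic above 3.84 corresponds to P < 0.05 against the chi-square distribution with one degree of freedom.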
Abstract:
Alzheimer's disease (AD), the most common form of dementia, is the fifth leading cause of death among U.S. adults aged 65 or older. Most AD patients have a shorter life expectancy compared with older people without dementia. This disease has become an enormous challenge in the aging society and is also a global problem: not only the families of patients with Alzheimer's disease but also the healthcare system and society as a whole have to confront it. In dementia, functional impairment is associated with basic activities of daily living (ADL) and instrumental activities of daily living (IADL). For patients with Alzheimer's disease, problems typically appear first in performing IADL and progress to an inability to manage the less complex ADL functions of personal care. Thus, assessment of ADLs can be used for early, accurate diagnosis of Alzheimer's disease. Estimating the survival of patients with Alzheimer's disease is useful for patients, caregivers, clinicians, and policy planners. However, when making predictions of patient outcome from patient histories, it is unclear whether time-dependent covariates provide important information on how changes in a patient's status affect survival. In this study, we examined the effect of impaired basic ADL as measured by the Physical Self-Maintenance Scale (PSMS) and utilized a multistate survival analysis approach to estimate the probability of death in the first few years after the initial visit for AD patients, taking into consideration the possibility of impaired basic ADL. The dataset used in this study was obtained from the Baylor Alzheimer's Disease and Memory Disorders Center (ADMDC). Absence of impaired basic ADL and older age at onset of impaired basic ADL were associated with longer survival. These findings suggest that the occurrence of impaired basic ADL and age at impaired basic ADL could be predictors of survival among patients with Alzheimer's disease.
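A discrete-time multistate model of the kind described above can be sketched with a three-state annual transition matrix (no impairment, impaired basic ADL, death). The transition probabilities below are illustrative placeholders, not estimates from the ADMDC data.

```python
# States: 0 = no impaired basic ADL, 1 = impaired basic ADL,
# 2 = death (absorbing). Rows give the assumed annual transition
# probabilities out of each state and must each sum to 1.
P = [[0.80, 0.15, 0.05],
     [0.00, 0.75, 0.25],
     [0.00, 0.00, 1.00]]

def prob_death_within(years, start=0):
    """P(death within `years` annual steps, starting from state `start`),
    obtained by propagating the state-occupancy distribution."""
    dist = [1.0 if s == start else 0.0 for s in range(3)]
    for _ in range(years):
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return dist[2]
```

Because death is absorbing, `prob_death_within` is monotone in the horizon; routing mortality risk through the impairment state is what lets onset of impaired ADL shift the survival estimate.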
Abstract:
Stomach cancer is the fourth most common cancer in the world, and ranked 16th in the US in 2008. The age-adjusted rates among Hispanics were 2.8 times those of non-Hispanic Whites in 1998-2002. In spite of that, previous research has found that Hispanics with non-cardia adenocarcinoma of the stomach have slightly better survival than non-Hispanic Whites. However, such previous research did not include a comparison with African-Americans, and it was limited to data released for the years 1973-2000 in the nine original Surveillance, Epidemiology, and End Results Cancer Registries. This finding was interpreted as related to the Hispanic Paradox, the phenomenon whereby Hispanics in the USA tend to have substantially better health than other ethnic groups in spite of what their aggregate socio-economic indicators would predict. We extended such research to the SEER 17 Registry, 1973-2005, with varying years of diagnosis per registry, and compared the survival of non-cardia adenocarcinoma of the stomach according to ethnicity (Hispanics, non-Hispanic Whites, and African-Americans), while controlling for age, gender, marital status, stage of disease, and treatment using Cox regression survival analysis. We found that Hispanic ethnicity by itself did not confer a survival advantage in non-cardia adenocarcinoma of the stomach, but that being born abroad was independently associated with the apparent 'Hispanic Paradox' previously reported, and that such an advantage was seen among foreign-born persons across all race/ethnic groups.
Abstract:
Purpose. A descriptive analysis of glioma patients by race was carried out in order to better elucidate potential differences between races in demographics, treatment, characteristics, prognosis, and survival. Patients and Methods. The study included 1,967 patients aged ≥ 18 years diagnosed with glioma and seen between July 2000 and September 2006 at The University of Texas M.D. Anderson Cancer Center (UTMDACC). Data were collated from the UTMDACC Patient History Database (PHDB) and the UTMDACC Tumor Registry Database (TRDB). Chi-square analysis, uni-/multivariate Cox proportional hazards modeling, and survival analysis were used to analyze differences by race. Results. Demographic, treatment, and histologic differences exist between races. Though risk differences were seen between races, race was not found to be a significant predictor in multivariate regression analysis after accounting for age, surgery, chemotherapy, radiation, and tumor type as stratified by WHO tumor grade. Age was the most consistent predictor of risk for death. Overall survival by race was significantly different (p = 0.0049) only in low-grade gliomas after adjustment for age, although survival differences were very slight. Conclusion. Among this cohort of glioma patients, age was the strongest predictor of survival. It is likely that survival is influenced more by age, time to treatment, tumor grade, and surgical expertise than by racial differences. However, age at diagnosis, gender ratios, histology, and history of cancer differed significantly between races, and genetic differences to this effect cannot be excluded.
Abstract:
Background. Colorectal cancer (CRC) is the third most commonly diagnosed cancer (excluding skin cancer) in both men and women in the United States, with an estimated 148,810 new cases and 49,960 deaths in 2008 (1). Racial/ethnic disparities have been reported across the CRC care continuum. Studies have documented racial/ethnic disparities in CRC screening (2-9), but only a few studies have looked at these differences in CRC screening over time (9-11). No studies have compared these trends in a population with CRC and without cancer. Additionally, although there is evidence suggesting that hospital factors (e.g. teaching hospital status and NCI designation) are associated with CRC survival (12-16), no studies have sought to explain the racial/ethnic differences in survival by looking at differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, as well as hospital characteristics. Objectives and Methods. The overall goals of this dissertation were to describe the patterns and trends of racial/ethnic disparities in CRC screening (i.e. fecal occult blood test (FOBT), sigmoidoscopy (SIG) and colonoscopy (COL)) and to determine if racial/ethnic disparities in CRC survival are explained by differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital factors. These goals were accomplished in a two-paper format. In Paper 1, "Racial/Ethnic Disparities and Trends in Colorectal Cancer Screening in Medicare Beneficiaries with Colorectal Cancer and without Cancer in SEER Areas, 1992-2002", the study population consisted of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and 62,917 Medicare beneficiaries without cancer during the same time period. Both cohorts were aged 67 to 89 years and resided in 16 Surveillance, Epidemiology and End Results (SEER) regions of the United States.
Screening procedures between 6 months and 3 years prior to the date of diagnosis for CRC patients, and prior to the index date for persons without cancer, were identified in Medicare claims. The crude and age-gender-adjusted percentages and odds ratios of receiving FOBT, SIG, or COL were calculated. Multivariable logistic regression was used to assess the effect of race/ethnicity on the odds of receiving CRC screening over time. Paper 2, "Racial/Ethnic Disparities in Colorectal Cancer Survival: To what extent are racial/ethnic disparities in survival explained by racial differences in socio-demographics, screening, co-morbidities, treatment, tumor or hospital characteristics", included a cohort of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and residing in 16 SEER regions of the United States, identified in the SEER-Medicare linked database. Survival was estimated using the Kaplan-Meier method. Cox proportional hazards modeling was used to estimate hazard ratios (HR) of mortality and 95% confidence intervals (95% CI). Results. The screening analysis demonstrated racial/ethnic disparities in screening over time among the cohort without cancer. From 1992 to 1995, Blacks and Hispanics were less likely than Whites to receive FOBT (OR=0.75, 95% CI: 0.65-0.87; OR=0.50, 95% CI: 0.34-0.72, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.72-0.85; OR=0.67, 95% CI: 0.54-0.75, respectively).
Blacks and Hispanics were less likely than Whites to receive SIG from 1992 to 1995 (OR=0.75, 95% CI: 0.57-0.98; OR=0.29, 95% CI: 0.12-0.71, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.68-0.93; OR=0.50, 95% CI: 0.35-0.72, respectively). The survival analysis showed that Blacks had worse CRC-specific survival than Whites (HR: 1.33, 95% CI: 1.23-1.44), but this was reduced for stage I-III disease after full adjustment for socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital characteristics (aHR=1.24, 95% CI: 1.14-1.35). Socioeconomic status, tumor characteristics, treatment, and co-morbidities contributed to the reduction in hazard ratios between Blacks and Whites with stage I-III disease. Asians had better survival than Whites before (HR: 0.73, 95% CI: 0.64-0.82) and after (aHR: 0.80, 95% CI: 0.70-0.92) adjusting for all predictors for stage I-III disease. For stage IV, both Asians and Hispanics had better survival than Whites, and after full adjustment, survival improved (aHR=0.73, 95% CI: 0.63-0.84; aHR=0.74, 95% CI: 0.61-0.92, respectively). Conclusion. Screening disparities remain between Blacks and Whites, and between Hispanics and Whites, but have decreased in recent years. Future studies should explore other factors that may contribute to screening disparities, such as physician recommendations and language/cultural barriers, in this and younger populations. There were substantial racial/ethnic differences in CRC survival among older Whites, Blacks, Asians, and Hispanics. Co-morbidities, SES, tumor characteristics, treatment, and other predictor variables contributed to, but did not fully explain, the CRC survival differences between Blacks and Whites. Future research should examine the role of quality of care, particularly the benefit of treatment and post-treatment surveillance, in racial disparities in survival.
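The Kaplan-Meier method used in the survival paper can be written compactly. This is a minimal generic sketch handling ties and right-censoring, with toy data, not the dissertation's analysis code.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier curve as a list of (event time, survival probability).

    events: 1 = death observed, 0 = right-censored. At each distinct
    event time t the survival estimate is multiplied by
    (1 - deaths_at_t / number_at_risk_just_before_t).
    """
    data = sorted(zip(times, events))
    n_at_risk, surv, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t, deaths, seen = data[i][0], 0, 0
        while i < len(data) and data[i][0] == t:   # group tied times
            deaths += data[i][1]
            seen += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= seen                          # drop deaths and censorings
    return curve
```

For example, `kaplan_meier([1, 2, 2, 3], [1, 0, 1, 1])` steps the curve to 0.75, then 0.5 (the censored subject leaves the risk set without an event), then 0.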
Abstract:
Head and Neck Squamous Cell Carcinoma (HNSCC) is the sixth most common malignancy in the world, with high rates of developing second primary malignancy (SPM) and moderately low survival rates. This disease has become an enormous challenge in cancer research and treatment. For HNSCC patients, a highly significant cause of post-treatment mortality and morbidity is the development of SPM. Hence, predicting the risk of developing SPM would be very helpful for patients, clinicians, and policy makers in estimating the survival of patients with HNSCC. In this study, we built a prognostic model to predict the risk of developing SPM in patients with newly diagnosed HNSCC. The dataset used in this research was obtained from The University of Texas MD Anderson Cancer Center. For the first aim, we used stepwise logistic regression to identify the prognostic factors for the development of SPM. Our final model contained cancer site and overall cancer stage as risk factors for SPM. The Hosmer-Lemeshow test (p-value = 0.15 > 0.05) showed that the final prognostic model fit the data well. The area under the ROC curve was 0.72, suggesting that the discrimination ability of our model was acceptable. Internal validation confirmed that the prognostic model was a good fit and would not over-optimistically predict the risk of SPM. This model needs external validation on a larger sample before it can be generalized to predict SPM risk for other HNSCC patients. For the second aim, we utilized a multistate survival analysis approach to estimate the probability of death for HNSCC patients, taking into consideration the possibility of SPM. Patients without SPM had longer survival. These findings suggest that the development of SPM could be a predictor of survival among patients with HNSCC.
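The reported area under the ROC curve (0.72) has a simple rank interpretation: the probability that a randomly chosen patient who developed an SPM receives a higher predicted risk than a randomly chosen patient who did not. A minimal sketch of that computation follows, with hypothetical scores rather than the study's data.

```python
def roc_auc(scores, labels):
    """AUC as the concordance probability: a random positive outranks a
    random negative, with ties counted as one half. Equivalent to the
    trapezoidal area under the empirical ROC curve."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to scores with no discrimination; 1.0 to perfect separation of the two outcome groups.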
Abstract:
This thesis presents a novel framework for the analysis and optimization of the encoding and decoding delay for multiview video. The objective of this framework is to provide a systematic methodology for the analysis of the delay in multiview encoders and decoders, and useful tools for the design of multiview encoders/decoders for applications with low-delay requirements. The proposed framework first characterizes the elements that influence the delay performance: i) the multiview prediction structure, ii) the hardware model of the encoder/decoder, and iii) the frame processing times. Second, it provides algorithms for the computation of the encoding/decoding delay of any arbitrary multiview prediction structure. The core of this framework is a methodology for the analysis of the multiview encoding/decoding delay that is independent of the hardware architecture of the encoder/decoder, completed with a set of models that particularize this delay analysis to the characteristics of that hardware architecture. Among these models, the ones based on graph theory acquire special relevance due to their capacity to decouple the influence of the different elements on the delay performance of the encoder/decoder, by means of an abstraction of its processing capacity.
To reveal possible applications of this framework, this thesis presents some examples of its utilization in design problems that affect multiview encoders and decoders. This application scenario covers the following cases: strategies for the design of prediction structures that take delay requirements into consideration in addition to the rate-distortion performance; design of the number of processors and analysis of the processor speed requirements in multiview encoders/decoders given a target delay; and comparative analysis of the encoding delay performance of multiview encoders with different processing capabilities and hardware implementations.
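The graph-theoretic delay models described above can be illustrated by treating the prediction structure as a DAG and computing earliest completion times along its dependency edges. This sketch assumes unlimited parallel processors, one of the hardware abstractions such models refine, and uses made-up frame names and processing times.

```python
from functools import lru_cache

def completion_times(deps, proc_time):
    """Earliest completion time of each frame when the prediction
    structure is viewed as a DAG: a frame can start only after all of
    its reference frames have finished. The overall delay is governed
    by the longest (critical) dependency path."""
    @lru_cache(maxsize=None)
    def finish(frame):
        ready = max((finish(ref) for ref in deps[frame]), default=0.0)
        return ready + proc_time[frame]
    return {f: finish(f) for f in deps}

# Hypothetical structure: B3 references P1 and P2, which both
# reference the intra frame I0. Times are in milliseconds.
delays = completion_times(
    {"I0": (), "P1": ("I0",), "P2": ("I0",), "B3": ("P1", "P2")},
    {"I0": 2.0, "P1": 1.0, "P2": 3.0, "B3": 1.0})
```

Here the critical path runs I0 to P2 to B3, so B3 completes at 6.0 ms; limiting the number of processors would turn this into a scheduling problem on the same graph.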
Abstract:
The purpose of this thesis was to investigate the offensive performance of elite handball teams when considering handball as a complex non-linear dynamical system. A time-dependent dynamic approach was adopted to assess the teams' performance during the game. The overall sample comprised the 240 games played in the 2011-2012 season of the men's Spanish Professional Handball League (ASOBAL League). In the subsequent analyses, only close games (final goal difference ≤ 5; n = 142) were considered. The situational variables match status, game location, quality of opposition, and game period were incorporated into the analysis. Three studies composed the core of the thesis. In the first study, we analyzed the game-scoring coordination between the time series representing the scoring processes of the two opposing teams throughout the game. Autocorrelation, cross-correlation, double moving average, and the Hilbert transform were used for the analysis. The scoring processes of the teams presented a high consistency across all the games, as well as strong in-phase modes of coordination in all the game contexts. The only differences were found when controlling for the game period. The coordination in the scoring processes of the teams was significantly lower for the 1st and 2nd periods (0–10 min and 10–20 min), showing a clearly increasing coordination as the game progressed. This suggests that the first 20 minutes are those that break the game-scoring. In the second study, we analyzed the temporal effects (immediate, short-term, and medium-term) of team timeouts on the teams' scoring performance. Multiple linear regression models were used for the analysis. The results showed increments of 0.59, 1.40 and 1.85 goals for the periods within the first, third and fifth timeout ball possessions for the teams that requested the timeout.
Conversely, significant negative effects on goals scored were found for the opposing teams, with decrements of 0.59, 1.43, and 2.04 goals for the same periods, respectively. The influence of situational variables on scoring performance was registered only in certain game periods. Finally, in the third study, we analyzed the temporal effects of player exclusions on teams' scoring performance, both for the teams suffering the exclusion (numerical inferiority) and for their opponents (numerical superiority). Multiple linear regression models were used for the analysis. The results showed significant negative effects on the number of goals scored by the teams with one player fewer, with decrements of 0.25, 0.40, 0.61, 0.62, and 0.57 goals for the periods comprising the first through fifth minutes before and after the exclusion. For the opposing teams, the results showed significant positive effects, with increments of the same magnitude in the same game periods. This trend was not affected by match status, game location, quality of opposition, or game period. The scoring increments were smaller than might be expected from a 2-minute numerical playing superiority. Psychological theories such as choking under pressure, where strong performance is expected, may help explain this finding. The final chapters of the thesis enumerate the main conclusions and underline the main practical applications arising from the three studies. Lastly, limitations and future research directions are described.
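The multiple linear regression approach described above can be illustrated with a minimal ordinary least squares sketch. All variable names, effect sizes, and data here are hypothetical simulations, not the thesis data; they only show how a timeout effect on goals scored would be estimated alongside situational covariates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: goals scored in a five-possession window, a timeout
# indicator, and situational covariates (all simulated for illustration).
n = 200
timeout = rng.integers(0, 2, n)          # 1 = team requested a timeout
home = rng.integers(0, 2, n)             # game location
score_diff = rng.integers(-4, 5, n)      # match status before the window
goals = 2.0 + 1.4 * timeout + 0.2 * home + rng.normal(0, 1, n)

# Ordinary least squares fit via np.linalg.lstsq.
X = np.column_stack([np.ones(n), timeout, home, score_diff])
beta, *_ = np.linalg.lstsq(X, goals, rcond=None)

# beta[1] estimates the timeout effect on goals scored in the window.
print(dict(zip(["intercept", "timeout", "home", "score_diff"], beta.round(2))))
```

With enough observations, the estimated timeout coefficient recovers the simulated effect (1.4 goals here), which is the same kind of quantity the study reports as goal increments per possession window.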
Resumo:
Survival data analysis has traditionally been based on the Cox regression model (COX, 1972). However, the proportional hazards assumption underlying this model may not hold in many practical situations. This restriction of the Cox model has generated interest in alternative approaches, among them dynamic models that allow covariate effects to vary over time. In this work, the main dynamic survival models with additive and multiplicative structures were reviewed in the nonparametric and semiparametric settings. Residual-based graphical methods were presented to assess the goodness of fit of these models. A time-dependent version of the area under the ROC curve, denoted AUC(t), was proposed to evaluate and compare predictive performance between survival models with additive and multiplicative structures. The performance of AUC(t) was assessed through a simulation study. Data from three studies described in the literature were also analyzed to illustrate or complement the scenarios considered in the simulation study. Overall, the results indicated that the graphical methods for assessing model adequacy, together with AUC(t), constitute a useful set of statistical tools for evaluating dynamic survival models in nonparametric and semiparametric settings. Moreover, applying this set of tools to several data sets showed that, while dynamic models are attractive because they allow time-dependent covariate effects, they may not be appropriate for every data set, since their estimation can face restrictions in some cases.
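A time-dependent AUC of the kind described above can be sketched in its simplest cumulative/dynamic form: at time t, "cases" are subjects who failed by t and "controls" are subjects still event-free at t, and AUC(t) is the probability that the model's risk score ranks a case above a control. This sketch drops the censoring-weight corrections (e.g. IPCW) that a full implementation such as the one proposed in the thesis would need, and uses toy data:

```python
import numpy as np

def auc_t(time, event, score, t):
    """Cumulative/dynamic AUC(t): probability that the risk score ranks a
    subject who failed by time t above one still event-free at t.
    Simplified: subjects censored before t are dropped (no IPCW weights)."""
    cases = (time <= t) & (event == 1)   # failed by t
    controls = time > t                  # still at risk beyond t
    if not cases.any() or not controls.any():
        return np.nan
    # Pairwise score comparisons; ties count one half.
    diff = score[cases][:, None] - score[controls][None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

# Toy check: a score equal to the negative event time ranks perfectly.
time = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
event = np.array([1, 1, 0, 1, 0])
score = -time
print(auc_t(time, event, score, t=5.0))  # 1.0 for this perfectly ranked toy data
```

Evaluating this quantity over a grid of t values gives the AUC(t) curve used to compare the predictive performance of competing survival models.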
Resumo:
INTRODUCTION: Visceral leishmaniasis (VL) is a neglected disease that affects millions of people worldwide and constitutes a serious public health problem. OBJECTIVES: To describe, in time and space, the dispersal of Lutzomyia longipalpis and the expansion of VL in the state of São Paulo (SP), and to identify factors associated with these processes. METHODS: Descriptive, ecological, and survival analysis studies were carried out. Information on the vector and on the cases was obtained from the Superintendência de Controle de Endemias and from the Sistema de Informações de Agravos de Notificação for the period 1997 to 2014. The study area comprised the 645 municipalities of SP. Thematic and flow maps were produced, and VL incidence, mortality, and case fatality in humans were calculated. Survival analysis techniques (Kaplan-Meier curves and Cox regression) were used to identify factors associated with vector dispersal and VL expansion. RESULTS: Following the detection of Lu. longipalpis in Araçatuba in 1997, the first autochthonous canine case (1998) and the first autochthonous human case (1999) occurred in SP. By 2014, the vector had been detected in 173 (26.8 per cent) municipalities, canine VL in 108 (16.7 per cent), and human VL in 84 (13 per cent). The three phenomena expanded from northwest to southeast at constant speeds. In the São José do Rio Preto region, vector dispersal occurred through neighborhood with previously infested municipalities, VL expansion was related to the seat municipalities of the microregions, and the disease occurred more intensely in the peripheral areas of the municipalities. The Marechal Rondon highway and the border with Mato Grosso do Sul were factors associated with the occurrence of all three events, as was the Euclides da Cunha highway for vector presence and canine cases, and the presence of prisons for human cases.
CONCLUSIONS: The dispersal of the vector and of VL in SP began in 1997, near the border with the state of Mato Grosso do Sul, advanced from northwest to southeast along the route of the Marechal Rondon highway, and proceeded in arithmetic progression, with the microregion seats of SP playing a preponderant role in this process. The autochthonous occurrence of canine and human VL followed the detection of Lu. longipalpis in Araçatuba and its subsequent spread across SP, rather than starting from places where the vector was already present. The use of survival analysis made it possible to identify factors associated with vector dispersal and VL expansion. The results of this study may be useful for improving VL surveillance and control activities, in order to delay its expansion and/or mitigate its effects when it occurs.
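The Kaplan-Meier step in the methods above can be sketched as a product-limit estimator over times to an event (here, imagined as time until vector detection in a municipality). The data below are invented for illustration, not the SP municipality data:

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival estimate S(t) at each distinct event time.
    time: follow-up times; event: 1 = event observed, 0 = censored."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    surv, s = {}, 1.0
    for t in np.unique(time[event == 1]):
        at_risk = (time >= t).sum()              # still under observation at t
        deaths = ((time == t) & (event == 1)).sum()
        s *= 1.0 - deaths / at_risk              # product-limit update
        surv[float(t)] = s
    return surv

# Toy example: detection times in years, with two censored observations.
print(kaplan_meier([3, 5, 5, 8, 12], [1, 1, 0, 1, 0]))
```

Comparing such curves between groups of municipalities (e.g. with and without a given highway) and then fitting a Cox regression is the standard route to the associated factors reported in the study.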
Resumo:
Transportation Systems Center, Cambridge, Mass.