941 results for Multiple Hypothesis Testing


Relevance: 80.00%

Abstract:

The problem of decentralized sequential detection is studied in this thesis, in a setting where the local sensors are memoryless, receive independent observations, and obtain no feedback from the fusion center. In addition to the traditional criteria of detection delay and error probability, we introduce a new constraint: the number of communications between the local sensors and the fusion center. This metric reflects both the cost of establishing communication links and the overall energy consumption over time. A new formulation for communication-efficient decentralized sequential detection is proposed, in which the overall detection delay is minimized subject to constraints on both the error probabilities and the communication cost. Two types of problems are investigated under this formulation: decentralized hypothesis testing and decentralized change detection. In the former case, an asymptotically person-by-person optimum detection framework is developed, in which the fusion center performs a sequential probability ratio test (SPRT) based on dependent observations. The proposed algorithm uses not only the statistics reported by the local sensors but also the reporting times. The asymptotic relative efficiency of the proposed algorithm with respect to the centralized strategy is expressed in closed form. When the probabilities of false alarm and missed detection are close to one another, a reduced-complexity algorithm is proposed based on a Poisson arrival approximation. Decentralized change detection with a communication cost constraint is also investigated: a person-by-person optimum change detection algorithm is proposed in which transmissions of sensing reports are modeled as a Poisson process, and the optimum threshold value is obtained through dynamic programming. An alternative method with a simpler fusion rule is also proposed, with threshold values determined by a combination of sequential detection analysis and constrained optimization. In both the decentralized hypothesis testing and change detection problems, tradeoffs in parameter choices are investigated through Monte Carlo simulations.
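
As a point of reference for the fusion-center test, the sketch below implements the classical centralized SPRT for a Gaussian shift in mean. It is a minimal illustration only: it omits the thesis's decentralized ingredients (reporting times, dependent observations), and all names and parameter values are invented.

```python
import numpy as np

def sprt(samples, mu0=0.0, mu1=1.0, alpha=0.01, beta=0.01):
    """Wald's SPRT for H0: N(mu0, 1) vs H1: N(mu1, 1).

    Returns (decision, n_used). Thresholds follow Wald's approximations
    A ~ (1 - beta) / alpha and B ~ beta / (1 - alpha).
    """
    upper = np.log((1 - beta) / alpha)   # accept H1 when LLR exceeds this
    lower = np.log(beta / (1 - alpha))   # accept H0 when LLR falls below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for unit-variance Gaussians
        llr += (mu1 - mu0) * x - 0.5 * (mu1**2 - mu0**2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

rng = np.random.default_rng(0)
print(sprt(rng.normal(1.0, 1.0, size=1000)))  # data generated under H1
```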

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 80.00%

Abstract:

The purpose of the study was to determine the relative effectiveness of problem-based learning (PBL), compared with the traditional method, for developing problem-solving skills while learning the applications of triangle solving in grade 10 at the Institución Educativa El Progreso in El Carmen de Viboral, Antioquia. Teaching and learning mathematics through the PBL strategy lets students and teachers approach knowledge much as scientists do: the first step is a situation of doubt or perplexity in the student, provoked by the problem situation posed by the teacher; the second is a moment of "suggestions" in which the mind leaps forward in search of a possible solution (Dewey, 1933, p. 102). The third step is the "intellectualization" of the perceived difficulty, turning it into a problem to be solved (Dewey, 1933, p. 103). The fourth is the "guiding idea or hypothesis," based on formulating suggested explanations or possible solutions (Dewey, 1933, p. 104). The fifth step is "reasoning," the rational elaboration of an idea that develops according to each person's abilities (Dewey, 1933, p. 105). The final step is the "testing of the hypothesis" in real situations. This process was carried out through four problem situations set in an authentic context, the remodeling of the main park of El Carmen de Viboral, with the aim of motivating students to learn certain mathematical concepts and to develop problem-solving skills. The research design was quasi-experimental, with an experimental group of 38 students from grade 10-2 and a control group of 37 students from grade 10-1. Data were collected through a pre-test administered before the treatment and a post-test administered afterwards to both groups; a scale of student satisfaction with the traditional methodology was applied to both groups, and a scale of satisfaction with the PBL strategy to the experimental group only, complemented by direct observation and a portfolio documenting all the students' constructions. The experimental strategy was applied over four months, four hours per week, during which the four problem situations were implemented. Among other findings, 86.5% of the students came to find mathematics classes interesting, contextualized, applicable, and meaningful, whereas before the treatment only 44.4% were satisfied with their mathematics classes, a 42.1-point change in attitude toward classes taught with the traditional methodology. The comparative analysis of specific competencies shows that the experimental group was more mathematically competent than the control group in all the competencies assessed: modeling ability, inductive ability, communicative ability, and procedural skill. The research project also had an added value: 10 students had the opportunity to learn more about their town's ceramics culture by designing and building mosaics, an activity offered free of charge by the local casa de la cultura.

Relevance: 80.00%

Abstract:

Adult anchovies in the Bay of Biscay migrate from north to south between late winter and early summer to spawn. However, what triggers and drives this geographic shift of the population remains poorly understood. An individual-based fish model has been implemented to explore the potential mechanisms that control the anchovy's movement routes toward its spawning habitats. To achieve this goal, two fish movement behaviors – gradient detection through restricted-area search and kinesis – simulated the fish's response to its dynamic environment. A bioenergetics model was used to represent individual growth and reproduction along the fish trajectory. The environmental forcing (food, temperature) of the model was provided by a coupled physical–biogeochemical model. We followed a hypothesis-testing strategy, running a series of simulations with different cues and computational assumptions. Gradient detection was found to be the most suitable mechanism for recreating the observed shift of the anchovy distribution under the combined effect of sea-surface temperature and zooplankton. In addition, our results suggest that southward movement occurred most actively from early April to mid-May, tracking the favorable spatio-temporal evolution of zooplankton and temperature. In terms of fish bioenergetics, individuals that ended up in the southern part of the bay were in better condition in terms of energy content, suggesting the resulting energy gain as an ecological explanation for the migration. The kinesis approach performed moderately, producing the distribution pattern with the largest spread. Finally, model performance was not significantly affected by changes in the starting date, initial fish distribution, or number of particles used in the simulations, whereas it was strongly influenced by the adopted cues.
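
The gradient-detection behavior can be illustrated with a toy local-search rule on a static suitability field. This is a sketch of the general mechanism only, not the thesis's coupled bioenergetics/biogeochemical model; the field, window size, and step rule are invented.

```python
import numpy as np

def gradient_step(field, pos, radius=1):
    """Move one cell toward the highest value inside a local search window,
    mimicking gradient detection through restricted-area search."""
    rows, cols = field.shape
    r, c = pos
    best, best_pos = field[r, c], (r, c)
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and field[rr, cc] > best:
                best, best_pos = field[rr, cc], (rr, cc)
    return best_pos

# Toy habitat-suitability field (e.g., a combined zooplankton/temperature index)
x, y = np.meshgrid(np.linspace(-2, 2, 50), np.linspace(-2, 2, 50))
suitability = np.exp(-((x - 1) ** 2 + (y + 1) ** 2))  # single southern peak

pos = (5, 5)
for _ in range(100):
    pos = gradient_step(suitability, pos)
print(pos)  # the particle converges toward the suitability peak
```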

Relevance: 80.00%

Abstract:

We consider an LTE network in which a secondary user acts as a relay, transmitting data to the primary user using a decode-and-forward mechanism, transparent to the base station (eNodeB). The relay can decode symbols more reliably if the employed precoder matrix indicators (PMIs) are known; however, in the closed-loop spatial multiplexing (CLSM) transmit mode this information is not always embedded in the downlink signal, so effective methods are needed to determine the PMI. In this thesis, we consider 2x2 and 4x4 MIMO downlink channels under CLSM and formulate two techniques to estimate the PMI at the relay using a hypothesis-testing framework. We evaluate their performance via simulations for various ITU channel models over a range of SNRs and for different channel quality indicators (CQIs). Compared to the case where the true PMI is known at the relay, the performance of the proposed schemes is within 2 dB at 10% block error rate (BLER) in almost all scenarios, and the techniques add minimal computational overhead to the existing receiver structure. Finally, we identify scenarios in which the proposed precoder detection algorithms, used in conjunction with the cooperative decode-and-forward relaying mechanism, benefit the primary user equipment (PUE) by improving its BLER performance. We conclude that the proposed algorithms, as well as the cooperative relaying mechanism at the CMR, can be gainfully employed in a variety of real-life scenarios in LTE networks.
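
Hypothesis testing over a finite precoder codebook amounts to picking the PMI whose hypothesized precoder best explains the received block. A minimal sketch under white Gaussian noise follows; the 2x2 codebook here is illustrative rather than the actual LTE codebook (specified in 3GPP TS 36.211), and the generic least-residual (maximum-likelihood) rule is not necessarily the thesis's exact technique.

```python
import numpy as np

# Hypothetical 2x2 codebook, for illustration only
codebook = [
    np.array([[1, 0], [0, 1]]) / np.sqrt(2),
    np.array([[1, 1], [1, -1]]) / 2,
    np.array([[1, 1], [1j, -1j]]) / 2,
]

def detect_pmi(Y, H, X):
    """Pick the PMI whose hypothesis Y = H @ W @ X + N leaves the smallest
    residual energy (the ML decision under white Gaussian noise)."""
    costs = [np.linalg.norm(Y - H @ W @ X) ** 2 for W in codebook]
    return int(np.argmin(costs))

rng = np.random.default_rng(1)
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
X = np.sign(rng.normal(size=(2, 64)))  # BPSK symbol block
true_pmi = 2
noise = 0.1 * (rng.normal(size=(2, 64)) + 1j * rng.normal(size=(2, 64)))
Y = H @ codebook[true_pmi] @ X + noise
print(detect_pmi(Y, H, X), "vs true PMI", true_pmi)
```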

Relevance: 80.00%

Abstract:

Background and Purpose - Stroke is of global importance: it causes an increasing amount of human suffering and economic burden, yet its management is far from optimal. The unsuccessful outcome of several research programs highlights the need for reliable data on which to plan future clinical trials. The Virtual International Stroke Trials Archive aims to aid the planning of clinical trials by collating and providing access to a rich resource of patient data for exploratory analyses. Methods - Data were contributed by the principal investigators of numerous trials from the past 16 years. These data have been centrally collated and are available for anonymized analysis and hypothesis testing. Results - Currently, the Virtual International Stroke Trials Archive contains 21 trials. There are data on 15 000 patients with both ischemic and hemorrhagic stroke. Ages range between 18 and 103 years, with a mean age of 69±12 years. Outcome measures include the Barthel Index, Scandinavian Stroke Scale, National Institutes of Health Stroke Scale, Orgogozo Scale, and modified Rankin Scale. Medical history and onset-to-treatment time are readily available, and computed tomography lesion data are available for selected trials. Conclusions - This resource has the potential to influence clinical trial design and implementation through data analyses that inform planning. (Stroke. 2007;38:1905-1910.)

Relevance: 80.00%

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to the electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved. To uncover why these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant quickly becomes cumbersome. Current tools for comparing groups of event sequences emphasize either a purely statistical or a purely visual approach. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in their findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to start with a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery. Integrating statistics into a visualization tool presents challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges in running many metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts.
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
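
The HVHT framework itself is not reproduced here, but the generic pattern it builds on — run one test per metric across the two cohorts, then correct for the many comparisons — can be sketched as follows. The metric-extraction interface, the Mann-Whitney test, and the Benjamini-Hochberg correction are all assumptions for illustration, not the dissertation's exact choices.

```python
import numpy as np
from scipy import stats

def compare_cohorts(cohort_a, cohort_b, metrics, q=0.05):
    """Run one two-sample test per metric across two cohorts of records and
    flag the results surviving a Benjamini-Hochberg FDR correction at level q."""
    names, pvals = [], []
    for name, extract in metrics.items():
        a = [extract(rec) for rec in cohort_a]
        b = [extract(rec) for rec in cohort_b]
        _, p = stats.mannwhitneyu(a, b, alternative="two-sided")
        names.append(name)
        pvals.append(p)
    order = np.argsort(pvals)            # BH step-up over all metrics
    m, passed = len(pvals), set()
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= q * rank / m:
            passed = set(order[:rank])
    return [(names[i], pvals[i], i in passed) for i in range(m)]

# Toy cohorts: records reduced to a single "duration" attribute
rng = np.random.default_rng(2)
A = [{"duration": d} for d in rng.exponential(1.0, 200)]
B = [{"duration": d} for d in rng.exponential(1.3, 200)]
print(compare_cohorts(A, B, {"duration": lambda r: r["duration"]}))
```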

Relevance: 80.00%

Abstract:

The Scharff technique is used for eliciting information from human sources. At the very core of the technique is the “illusion of knowing it all” tactic, which aims to inflate a source's perception of how much knowledge an interviewer holds about the event to be discussed. For the current study, we mapped the effects of two different ways of introducing this tactic: a traditional implementation in which the interviewer explicitly states that he or she already knows most of the important information (the traditional condition), and a new implementation in which the interviewer simply starts to present the information that he or she holds (the just start condition). The two versions were compared in two separate experiments. In Experiment 1 (N = 60), we measured the participants’ perceptions of the interviewer's knowledge, and in Experiment 2 (N = 60), the participants’ perceptions of the interviewer's knowledge gaps. We found that participants in the just start condition (a) believed the interviewer had more knowledge (Experiment 1) and (b) searched less actively for gaps in the interviewer's knowledge (Experiment 2), compared to the traditional condition. We discuss these findings, and how sources test and perceive the knowledge their interviewer possesses, within a framework of social hypothesis testing.

Relevance: 80.00%

Abstract:

Objective: To compare the efficacy and safety of 4 mg of ondansetron vs. 4 mg of nalbuphine for the treatment of neuraxial morphine-induced pruritus in patients at the “Dr. José Eleuterio González” University Hospital from September 2012 to August 2013. Material and methods: A controlled, prospective, randomized study of 28 patients (14 per group) receiving neuraxial morphine analgesia was conducted; the study was registered and approved by the institution's ethics committee, and patients agreed to participate under informed consent. The results were segmented and contrasted (according to drug) by hypothesis testing; association was assessed by the χ² test with a 95% confidence interval (CI). Results: Pruritus was effectively resolved in both groups, and no significant difference was found in the remaining variables. An increase in the visual analogue scale (VAS) score at 6 and 12 hours was observed in the ondansetron group, which was statistically significant (p ≤ 0.05); however, both groups had VAS scores below 3. Conclusions: When comparing the efficacy and safety of ondansetron 4 mg vs. nalbuphine 4 mg for the treatment of neuraxial morphine-induced pruritus, the only significant difference found was the mean VAS score at 6 and 12 hours, favoring the ondansetron group. However, both groups scored below 3 on the VAS; we therefore consider both treatments effective and safe for treating pruritus caused by neuraxial morphine.
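
The χ² association test mentioned in the methods corresponds to a standard contingency-table analysis. A generic sketch follows, with invented counts rather than the study's data:

```python
from scipy import stats

# Hypothetical 2x2 table: rows = drug (ondansetron, nalbuphine),
# columns = pruritus resolved (yes, no). Counts are invented for illustration.
table = [[12, 2],
         [11, 3]]
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2={chi2:.3f}, p={p:.3f}")  # p > 0.05 -> no significant association
```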

Relevance: 80.00%

Abstract:

Given the current paradigm, in which public entities are constantly required to rationalize resources, public military university higher-education establishments are no exception, and efficient, effective management becomes ever more pressing. In this context, cost accounting increasingly plays a dominant role in the analysis and control of costs by activity. This applied research project addresses the theme "The Training of Administration Officers: Opportunities, Specificities and Contingencies on the Path of a Professional Career." Its general objective is to calculate the training costs of Administration students in the three branches of the Armed Forces and thus identify the most economical training model. Among the many available costing systems, the cost calculation was based on the method of homogeneous sections, or cost centres. The work is structured in two parts, the first theoretical and the second practical. The methodology followed the social-science research method: starting from a central research question, which gives rise to derived questions, answers are sought through the formulation, exploration, and testing of hypotheses. The results show that the training model used at the Academia Militar is the most economical. Given the evident scientific affinities among the courses, it would therefore be pertinent to reconfigure their scientific structure, durations, and training profiles. A reorganization that eliminates redundancies and promotes the sharing of resources would yield efficiency gains in management and, consequently, cost reductions.
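
In outline, the homogeneous-sections (cost-centre) method charges each section's cost to cost objects in proportion to the activity units they consume. A minimal sketch with invented figures (section names, costs, and hours are all hypothetical):

```python
# Homogeneous-sections costing: compute a unit cost per section, then charge
# each cost object (here, a course) by the activity units it consumes.
sections = {
    # section: (total cost in EUR, activity units, e.g. teaching hours)
    "instruction": (400_000.0, 8_000),
    "support":     (150_000.0, 5_000),
}
unit_cost = {s: cost / units for s, (cost, units) in sections.items()}

# Hours each course consumes from each section (invented)
consumption = {"Administration course": {"instruction": 1_200, "support": 300}}

for course, hours in consumption.items():
    total = sum(unit_cost[s] * h for s, h in hours.items())
    print(course, f"costs {total:,.2f} EUR")
```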

Relevance: 80.00%

Abstract:

Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper reviewing the measles-mumps-rubella (MMR) vaccine and autism controversy in the UK to illustrate a number of threshold concepts underlying good study design and the interpretation of scientific evidence. These include the necessity of a sufficient sample size, representative and random sampling, appropriate controls, and the conditions for inferring causation.

Relevance: 80.00%

Abstract:

Regular physical activity plays a fundamental role in the prevention and control of musculoskeletal disorders within the occupational activity of physical education teachers. Objective: The purpose of the study was to determine the relationship between physical activity levels and the prevalence of musculoskeletal disorders in physical education teachers from 42 public schools in Bogotá, Colombia. Methods: A cross-sectional study was conducted in 262 physical education teachers from 42 public schools in Bogotá, Colombia. The Nordic Musculoskeletal Questionnaire and the short-form IPAQ questionnaire were self-administered to identify physical activity levels. Measures of central tendency and dispersion were obtained for quantitative variables, and relative frequencies for qualitative variables. Lifetime prevalence and the percentage of job reassignment were calculated for teachers who had suffered different types of pain. A simple binary logistic regression model was used to estimate the relationship between pain and the teachers' sociodemographic variables. Analyses were performed in SPSS version 20; p < 0.05 was considered significant for hypothesis testing, with a 95% confidence level for parameter estimation. Results: The response rate was 83.9%, yielding 262 valid records; 22.5% of respondents were female, and the largest group of teachers was between 25 and 35 years old (43.9%). Regarding musculoskeletal disorders, 16.9% of the teachers reported having ever suffered discomfort in the neck, 17.2% in the shoulder, 27.9% in the back, 7.93% in the arm, and 8.4% in the hand. Teachers with higher physical activity levels reported a lower prevalence of musculoskeletal complaints (16.9% for the neck and 27.7% for the dorsal/lumbar region) than subjects with low physical activity levels. The presence of disorders was associated with years of experience (OR 3.39, 95% CI 1.41-7.65), female gender (OR 4.94, 95% CI 1.94-12.59), age (OR 5.06, 95% CI 1.25-20.59), and having more than 400 students in one's charge during the working day (OR 4.50, 95% CI 1.74-11.62). Conclusions: No statistically significant relationship was found between self-reported physical activity levels and musculoskeletal disorders in physical education teachers.
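
The reported odds ratios come from a simple binary logistic regression; the sketch below reproduces that kind of analysis on synthetic data. The coefficient is chosen only so the true OR is of the same order as the reported gender OR; nothing here is the study's dataset.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 262
female = rng.integers(0, 2, n)               # hypothetical binary covariate
logit = -1.5 + 1.6 * female                  # true OR = exp(1.6) ~ 4.95
pain = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(female.astype(float))    # intercept + covariate
fit = sm.Logit(pain, X).fit(disp=0)
or_est = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])  # exponentiate the coefficient CI
print(f"OR={or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```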

Relevance: 80.00%

Abstract:

This thesis project studies the agent identity privacy problem in the scalar linear quadratic Gaussian (LQG) control system. Privacy models and privacy measures must first be established for this problem, which depends on a trajectory of correlated data rather than a single observation. I propose privacy models and corresponding privacy measures that take these two characteristics into account. The agent identity is a binary hypothesis: Agent A or Agent B. An eavesdropper is assumed to perform a hypothesis test on the agent identity based on the intercepted environment-state sequence. The privacy risk is measured by the Kullback-Leibler divergence between the probability distributions of the state sequences under the two hypotheses. Taking into account both the accumulated control reward and the privacy risk, an optimization problem over the policy of Agent B is formulated, addressing the two design objectives: maximizing the control reward and minimizing the privacy risk. The optimal deterministic privacy-preserving LQG policy of Agent B is a linear mapping, and a sufficient condition is given to guarantee that this policy is time-invariant in the asymptotic regime. Adding an independent Gaussian random variable cannot improve the performance of Agent B. Finally, the theoretical results on the LQG control policy and the trade-off between control reward and privacy risk are confirmed by numerical experiments, which illustrate the reward-privacy trade-off and yield observations and insights that are explained in the last chapter.
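
For Gaussian state sequences, the Kullback-Leibler privacy risk has a closed form. A sketch for two multivariate Gaussian sequence distributions, with invented parameters (the means and covariance below do not come from the thesis):

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) for multivariate Gaussians."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# State sequences of length T stacked into a single Gaussian vector (toy case)
T = 5
mu_A, mu_B = np.zeros(T), 0.2 * np.ones(T)   # hypothetical means under A and B
Sigma = 0.5 * np.eye(T)
print(kl_gaussian(mu_A, Sigma, mu_B, Sigma))  # larger -> easier eavesdropping
```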

Relevance: 60.00%

Abstract:

An optimal multiple testing procedure is identified for linear hypotheses under the general linear model, maximizing the expected number of false null hypotheses rejected at any significance level. The optimal procedure depends on the unknown data-generating distribution, but it can be consistently estimated. By drawing information together across many hypotheses, the estimated optimal procedure provides an empirical alternative hypothesis, adapting to the underlying patterns of departure from the null. The proposed multiple testing procedures based on this empirical alternative are evaluated through simulations and an application to gene expression microarray data. Compared to a standard multiple testing procedure, it is not unusual for the empirical alternative hypothesis to increase the number of true positives identified at a given significance level by 50% or more.
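
The paper's estimator is not reproduced here; the sketch below only illustrates the general idea of an empirically estimated alternative: pool the test statistics, estimate their marginal density, and rank hypotheses by an estimated likelihood ratio against the null rather than by |z| alone. All modeling choices below are stylized assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
m = 5000
is_alt = rng.random(m) < 0.1                     # 10% true alternatives
z = rng.normal(0.0, 1.0, m)
z[is_alt] += rng.normal(2.0, 0.5, is_alt.sum())  # signal only under the alternative

# Estimate the marginal density f of the z-statistics with a KDE, then rank
# hypotheses by the estimated likelihood ratio f(z) / f0(z) against the
# N(0, 1) null, letting the data shape the implicit alternative.
kde = stats.gaussian_kde(z)
lr = kde(z) / stats.norm.pdf(z)
top = np.argsort(-lr)[:500]
print("true alternatives among the 500 top-ranked:", int(is_alt[top].sum()))
```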

Relevance: 50.00%

Abstract:

It is common in econometric applications that several hypothesis tests are carried out at the same time. The problem then becomes how to decide which hypotheses to reject, accounting for the multitude of tests. In this paper, we suggest a stepwise multiple testing procedure which asymptotically controls the familywise error rate at a desired level. Compared to related single-step methods, our procedure is more powerful in the sense that it often will reject more false hypotheses. In addition, we advocate the use of studentization when it is feasible. Unlike some stepwise methods, our method implicitly captures the joint dependence structure of the test statistics, which results in increased ability to detect alternative hypotheses. We prove our method asymptotically controls the familywise error rate under minimal assumptions. We present our methodology in the context of comparing several strategies to a common benchmark and deciding which strategies actually beat the benchmark. However, our ideas can easily be extended and/or modified to other contexts, such as making inference for the individual regression coefficients in a multiple regression framework. Some simulation studies show the improvements of our methods over previous proposals. We also provide an application to a set of real data.
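
The paper's procedure relies on the bootstrap to capture the joint dependence of the test statistics; as a simpler stepwise point of comparison, here is the classic Holm step-down procedure, which also controls the familywise error rate (in finite samples, under arbitrary dependence) but does not exploit that dependence:

```python
import numpy as np

def holm(pvals, alpha=0.05):
    """Holm step-down: reject hypotheses in order of increasing p-value
    until p_(k) > alpha / (m - k + 1). Controls the FWER without using the
    joint dependence of the test statistics (unlike the paper's bootstrap)."""
    m = len(pvals)
    order = np.argsort(pvals)
    reject = np.zeros(m, dtype=bool)
    for k, idx in enumerate(order):          # k = 0 .. m-1
        if pvals[idx] <= alpha / (m - k):    # step-down threshold
            reject[idx] = True
        else:
            break                            # stop at the first failure
    return reject

print(holm(np.array([0.001, 0.01, 0.04, 0.30])))  # [ True  True False False]
```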