989 results for credible commitments.


Relevance: 10.00%

Abstract:

Two studies investigated the influence of juror need for cognition on the systematic and heuristic processing of expert evidence. U.S. citizens reporting for jury duty in South Florida read a 15-page summary of a hostile work environment case containing expert testimony. The expert described a study she had conducted on the effects of viewing sexualized materials on men's behavior toward women. Certain methodological features of the expert's research varied across experimental conditions. In Study 1 (N = 252), the expert's study was valid, contained a confound, or included the potential for experimenter bias (internal validity) and relied on a small or large sample (sample size) of college undergraduates or trucking employees (ecological validity). When the expert's study included trucking employees, high need for cognition jurors in Study 1 rated the expert more credible and trustworthy than did low need for cognition jurors. Jurors were insensitive to variations in the study's internal validity or sample size. Juror ratings of plaintiff credibility, plaintiff trustworthiness, and study quality were positively correlated with verdict. In Study 2 (N = 162), the expert's published or unpublished study (general acceptance) was either valid or lacked an appropriate control group (internal validity) and included a sample of college undergraduates or trucking employees (ecological validity). High need for cognition jurors in Study 2 found the defendant liable more often and evaluated the expert evidence more favorably when the expert's study was internally valid than when an appropriate control group was missing. Low need for cognition jurors did not differentiate between the internally valid and invalid study. Variations in the study's general acceptance and ecological validity did not affect juror judgments. Juror ratings of expert and plaintiff credibility, plaintiff trustworthiness, and study quality were positively correlated with verdict. The present research demonstrated that the need for cognition moderates juror sensitivity to expert evidence quality and that certain message-related heuristics influence juror judgments when ability or motivation to process systematically is low.

Relevance: 10.00%

Abstract:

In broad terms — including a thief's use of existing credit card, bank, or other accounts — the number of identity fraud victims in the United States ranges from 9 to 10 million per year, or roughly 4% of the US adult population. The average annual theft per stolen identity was estimated at $6,383 in 2006, up approximately 22% from $5,248 in 2003, corresponding to an increase in estimated total theft from $53.2 billion in 2003 to $56.6 billion in 2006. About three million Americans each year fall victim to the worst kind of identity fraud: new account fraud. Names, Social Security numbers, dates of birth, and other data are acquired fraudulently from the issuing organization or from the victim; these data are then used to create fraudulent identity documents. In turn, these are presented to other organizations as evidence of identity, used to open new lines of credit, secure loans, "flip" property, or otherwise turn a profit in a victim's name. This is much more time consuming — and typically more costly — to repair than fraudulent use of existing accounts. This research borrows from well-established theoretical backgrounds in an effort to answer the question: what is it that makes identity documents credible? Most importantly, identification of the components of credibility draws upon personal construct psychology, the underpinning for the repertory grid technique, a form of structured interviewing that arrives at a description of the interviewee's constructs on a given topic, such as the credibility of identity documents. This represents a substantial contribution to theory, being the first research to use the repertory grid technique to elicit from experts the mental constructs they use to evaluate the credibility of different types of identity documents reviewed in the course of opening new accounts. The research identified twenty-one characteristics, different subsets of which are present on different types of identity documents. Expert evaluations of these documents in different scenarios suggest that visual characteristics are most important for a physical document, while authenticated personal data are most important for a digital document.

Relevance: 10.00%

Abstract:

Cohort programs have been instituted at many universities to accommodate the growing number of mature adult graduate students who pursue degrees while maintaining multiple commitments such as work and family. While it is estimated that as many as 40–60% of students who begin graduate study fail to complete degrees, it is thought that attrition may be even higher for this population of students. Yet, little is known about the impact of cohorts on the learning environment and whether cohort programs affect graduate student retention. Retention theory stresses the importance of the academic department, quality of faculty-student relationships and student involvement in the life of the academic community as critical determinants in students' decisions to persist to degree completion. However, students who are employed full-time typically spend little time on campus engaged in the learning environment. Using academic and social integration theory, this study examined the experiences of working adult graduate students enrolled in cohort (CEP) and non-cohort (non-CEP) programs and the influence of these experiences on intention to persist. The Graduate Program Context Questionnaire was administered to graduate students (N = 310) to examine measures of academic and social integration and intention to persist. Sample t tests and ANOVAs were conducted to determine whether differences in perceptions could be identified between cohort and non-cohort students. Multiple linear regression was used to identify variables that predict students' intention to persist. While there were many similarities, significant differences were found between CEP and non-CEP student groups on two measures. CEP students rated peer-student relationships higher and scored higher on the intention to persist measure than non-CEP students. The psychological integration measure, however, was the strongest predictor of intention to persist for both the CEP and non-CEP groups. This study supports the research literature which suggests that CEP programs encourage the development of peer-student relationships and promote students' commitment to persistence.

Relevance: 10.00%

Abstract:

Since the arrival of the first African slaves to Cuba in 1524, the issue of race has had a long-lived presence in the Cuban national discourse. However, despite Cuba's colonial history, some historians have often maintained that race relations in Cuba were congenial, with racism and racial discrimination never being as deep or as widespread in Cuba as in the United States (Cannon, 1983, p. 113). In fact, it has been argued that institutionalized racism was introduced into Cuban society with the first U.S. occupation, during 1898–1902 (Cannon, 1983, p. 113). This study investigates the influence of the United States on the development of race relations and racial perceptions in post-independence Cuba, specifically from 1898 to 1902. These years comprise the period immediately following the final fight for Cuban independence, culminating in the Cuban-Spanish-American War and the first U.S. occupation of Cuba. By this time, the Cuban population comprised Africans as well as descendants of Africans, White Spanish people, indigenous Cubans, and the offspring of the intermixing of these groups. This research examines whether the United States' own race relations and racial perceptions influenced the initial, conflicting race relations and racial perceptions in Cuba during and after the U.S. occupation. The study uses a collective interpretative framework that combines a national level of analysis with a focus on race relations and racial perceptions. This framework reaches beyond the perspectives traditionally utilized when interpreting the impact of the United States during and following its intervention in Cuba. Attention is given to the role of the social and political climate within the United States as a driving influence on its involvement with Cuba. The study reveals that emphasizing the role of the United States as critical to the development of Cuba's race relations and racial perceptions is credible, given the extensive involvement of the U.S. in building the early Cuban Republic and the use of U.S. structures as models for reconstruction. The government formed in Cuba was aligned with a governing system reflecting the existing governing codes of the U.S. during that period.

Relevance: 10.00%

Abstract:

The study explored when, under what conditions, and to what extent European integration, particularly the European Union's democratic conditionality, contributed to democratic consolidation in Spain, Poland, and Turkey. On the basis of a four-part definition, the dissertation examined the democratizing impact of the European integration process on each of the following four components of consolidation: (i) the holding of fair, free, and competitive elections; (ii) the protection of fundamental rights, including human and minority rights; (iii) high prospects of regime survival and civilian control of the military; and (iv) legitimacy, elite consensus, and stateness. To assess the relative significance of the EU's democratizing leverage, the thesis also examined domestic and non-EU international dynamics of democratic consolidation in the three countries. Employing two qualitative methods (case study and process-tracing), the study focused on three specific time frames: 1977–1986 for Spain, 1994–2004 for Poland, and 1999–present for Turkey. In addition to official documents, newspapers, and secondary sources, face-to-face interviews conducted with politicians, academics, experts, bureaucrats, and journalists in the three countries were utilized. The thesis generated several conclusions. First, the EU's democratizing impact is not uniform across the different components of democratic consolidation. Moreover, the EU's democratizing leverage in Spain, Poland, and Turkey varied over time for three major reasons: (i) the changing nature of the EU's democratic conditionality over time, (ii) varying levels of the EU's credible commitment to the candidate country's prospect for membership, and (iii) domestic dynamics in the candidate countries. Furthermore, the European integration process favors democratic consolidation, but its impact is shaped by the candidate country's prospect for EU membership and by domestic factors in the candidate country. Finally, the study carries a major policy implication for the European Union: unless the EU provides a clear prospect for membership, its democratizing leverage in candidate countries will be limited.

Relevance: 10.00%

Abstract:

Developing scientifically credible tools for measuring the success of ecological restoration projects is a difficult and non-trivial task. Yet, reliable measures of the general health and ecological integrity of ecosystems are critical for assessing the success of restoration programs. The South Florida Ecosystem Restoration Task Force (Task Force), which helps coordinate a multi-billion-dollar, multi-organizational effort among federal, state, local, and tribal governments to restore the Florida Everglades, is using a small set of system-wide ecological indicators to assess the restoration efforts. A team of scientists and managers identified eleven ecological indicators from a field of several hundred through a selection process using 12 criteria to determine their applicability as part of a system-wide suite. The 12 criteria are: (1) Is the indicator relevant to the ecosystem? (2) Does it respond to variability at a scale that makes it applicable to the entire system? (3) Is the indicator feasible to implement and is it measurable? (4) Is the indicator sensitive to system drivers and is it predictable? (5) Is the indicator interpretable in a common language? (6) Are there situations where an optimistic trend with regard to an indicator might suggest a pessimistic restoration trend? (7) Are there situations where a pessimistic trend with regard to an indicator may be unrelated to restoration activities? (8) Is the indicator scientifically defensible? (9) Can clear, measurable targets be established for the indicator to allow for assessments of success? (10) Does the indicator have sufficient specificity to result in corrective action? (11) What level of ecosystem process or structure does the indicator address? (12) Does the indicator provide early warning signs of ecological change? In addition, a two-page stoplight report card was developed to assist in communicating the complex science inherent in ecological indicators in a common language for resource managers, policy makers, and the public. The report card employs a universally understood stoplight symbol that uses green to indicate that targets are being met, yellow to indicate that targets have not been met and corrective action may be needed, and red to represent that targets are far from being met and corrective action is required. This paper presents the scientific process and the results of the development and selection of the criteria, the indicators, and the stoplight report card format and content. The detailed process and results for the individual indicators are presented in companion papers in this special issue of Ecological Indicators.
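
The stoplight logic is simple enough to express directly. Below is a minimal Python sketch of that mapping; the indicator names, the normalized values, and the cutoff separating yellow from red are illustrative assumptions, not Task Force data or code.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    value: float             # current status of the indicator (illustrative units)
    target: float            # restoration target
    warn_frac: float = 0.75  # hypothetical cutoff separating "yellow" from "red"

def stoplight(ind: Indicator) -> str:
    """Green: target met. Yellow: not met, corrective action may be needed.
    Red: far from the target, corrective action required."""
    if ind.value >= ind.target:
        return "green"
    if ind.value >= ind.warn_frac * ind.target:
        return "yellow"
    return "red"

# Made-up indicator values relative to normalized targets, for illustration only.
report_card = [Indicator("wading birds", 1.02, 1.0),
               Indicator("periphyton", 0.80, 1.0),
               Indicator("invasive exotic plants", 0.40, 1.0)]
for ind in report_card:
    print(f"{ind.name}: {stoplight(ind)}")
```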

Relevance: 10.00%

Abstract:

Perception of self as a non-reader has been identified as one of the factors that lead poor readers to disengage from the reading process (Strang, 1967; Rosow, 1992), thus impeding progress. Perception and informational processes influence judgments of personal efficacy (Bandura, 1997). The student's sense of reading efficacy, which influences effort expenditure and ultimately achievement, is often overlooked (Athey, 1985; Pajares, 1996). Academic routines within educational programs are implemented without adequate information on whether routines promote or impede efficacy growth. Cross-age tutoring, a process known to improve participants' academic achievement and motivation and to provide opportunities for authentic reading practice, has been successfully incorporated into reading instruction designs (Allen, 1976; Cohen, Kulik & Kulik, 1982; Labbo & Teale, 1990; Riessman, 1993). This study investigated the impact that teacher-designed routines within a cross-age tutoring model have on the tutor's sense of reading self-efficacy. The Reader Self-Perception Scale (Henk & Melnick, 1992) was administered, pre- and post-treatment, to 118 fifth-grade students. Preceding the initial survey administration, intact classes were randomly assigned to one of three commonly utilized cross-age tutoring routines or designated as the non-treatment population. The data derived from the Reader Self-Perception Scale were analyzed using an analysis of covariance (ANCOVA). Results indicated that participation as a cross-age tutor does not significantly increase the tutor's perception of self as reader in one or more of the four modes of information influencing self-efficacy, as compared to the non-treatment group. The results of this study suggest that although a weekly tutoring session delivering educationally credible routines impacts achievement and motivation, an effect on efficacy was not evident. Possible explanations and recommendations for future studies are proposed.


Relevance: 10.00%

Abstract:

Engineering analysis of geometric models has been the main, if not the only, credible and reasonable tool used by engineers and scientists to solve physical boundary problems. New high-speed computers have improved the accuracy and validation of the expected results. In practice, an engineering analysis is composed of two parts: the design of the model and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to solve a large number of physical boundary problems independently of the model geometry. The time expended in the computational process is related to the imposed boundary conditions and to how well conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and major commercial solver packages are incapable of handling such inputs. Other packages apply methods such as patching or zippering to resolve these problems, but the final resolved geometry may differ from the original, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that is able to analyze an imperfect geometric model with the imposed boundary conditions, using a meshfree method and a distance field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those for the same models with perfect geometries. Plots of the results will be presented to support further analysis and conclusions about the algorithm's convergence.
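
As a rough illustration of the general idea (not the dissertation's algorithm), the following Python sketch builds a distance-field approximation to a boundary that is sampled as points and contains a gap, then forms a meshfree trial solution that vanishes on that boundary without repairing the geometry. The square geometry, the nearest-neighbor distance field, and the placeholder nodal field are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

# Boundary of a unit square sampled as points, with a gap cut from the bottom
# edge to mimic an "imperfect" geometry (open lines / missing segments).
t = np.linspace(0.0, 1.0, 400)
edges = np.vstack([np.column_stack([t, np.zeros_like(t)]),   # bottom
                   np.column_stack([t, np.ones_like(t)]),    # top
                   np.column_stack([np.zeros_like(t), t]),   # left
                   np.column_stack([np.ones_like(t), t])])   # right
gap = (edges[:, 1] == 0.0) & (edges[:, 0] > 0.4) & (edges[:, 0] < 0.6)
boundary_pts = edges[~gap]

tree = cKDTree(boundary_pts)

def distance_field(x):
    """Approximate distance from points x (n x 2) to the sampled boundary."""
    d, _ = tree.query(x)
    return d

# Meshfree trial solution u(x) = d(x) * w(x): it vanishes on the sampled
# boundary by construction, so a homogeneous Dirichlet condition is imposed
# without repairing the gap. Here w is a placeholder field; a solver would
# determine it (e.g. by collocation or Galerkin projection).
pts = np.random.rand(5, 2)
w = np.ones(len(pts))
u = distance_field(pts) * w
print(np.round(u, 3))
```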

Relevance: 10.00%

Abstract:

This work is a comparative study of three black brotherhoods that existed in Pernambuco in the eighteenth century: the Brotherhoods of Our Lady of the Rosary of Black Men of Recife, Olinda, and Goiás. The goal was to understand the similarities and differences among them, taking as a benchmark their operating statutes, known as Commitments. From the analysis of these Commitments, together with other documents produced by the Brotherhoods and by the administrative and religious authorities, we sought to establish the social profile of the towns in question, as well as the participation of black people within them. We sought to understand the historical conditions of the period, in which slave society placed black people in a position of subordination. Nevertheless, as bearers of culture, they were able, despite this condition, to overcome social obstacles, opening possibilities for the cultural manifestations of their group to occur. Coexistence within the Brotherhoods of the Rosary, which beyond being mutual-aid organizations within the Catholic religion also constituted mediating fields between high culture and popular culture, made these organizations social spaces of representation permitted by the existing order. The Brotherhoods of the Rosary in Recife, Olinda, and Goiás had their own hierarchical logic, which engendered the construction of new black identities marked by a cultural circularity made possible by the Atlantic diaspora process.

Relevance: 10.00%

Abstract:

This work aims to show how the discourse of George W. Bush's presidential administration was constructed to engender the War on Terror. Through an analysis of sources, magazines, newspapers, and official speeches by the President, it traces the process by which the U.S. government developed a discourse intended to make the existence of weapons of mass destruction in Iraq credible to the world. To accomplish this, the work first seeks to deconstruct the figure of the terrorist and terrorist actions against hegemonic governments, and engages in an important discussion of the history of the present time and the need for research of this kind today. In deconstructing the idea of the terrorist, it shows how President George W. Bush used the attacks of September 11th and fear as tools to build a war whose real intent was the conquest of Iraqi oil and the completion of a task that his father, George H. W. Bush, had left unfinished.

Relevance: 10.00%

Abstract:

Studies reveal that a decrease in sleep duration has occurred in recent decades. Social commitments, such as work and school, are often not aligned with the "biological time" of individuals. Added to this, there is a reduced zeitgeber strength caused by less exposure to daylight and greater light exposure in the evening. This causes a chronic sleep debt that is offset on free days. In effect, a weekly pattern of sleep restriction and extension known as "social jet lag" occurs. Sleep deprivation has been associated with obesity, cancer, and cardiovascular risk. It is suggested that the autonomic nervous system is a pathway connecting sleep problems to cardiovascular diseases. However, beyond the evidence from studies using models of acute and controlled sleep deprivation, studies are needed to investigate the effects of chronic sleep deprivation as it occurs in social jet lag. The aim of this study was to investigate the influence of social jet lag on circadian rest-activity markers and cardiac function in medical students. It is a cross-sectional, observational study conducted in the Laboratory of Neurobiology and Biological Rhythmicity (LNRB) of the Department of Physiology at UFRN. Medical students enrolled in the first semester of their course at UFRN participated in the survey. The instruments for data collection were the Munich Chronotype Questionnaire, the Horne and Östberg Morningness-Eveningness Questionnaire, the Pittsburgh Sleep Quality Index, the Epworth Sleepiness Scale, an actimeter, and a heart rate monitor. Descriptive sleep variables, nonparametric circadian indexes (IV60, IS60, L5, and M10), and cardiac indexes in the time domain, frequency domain (LF, HF, LF/HF), and nonlinear domain (SD1, SD2, SD1/SD2) were analyzed. Descriptive, comparative, and correlational statistical analyses were performed with SPSS version 20. Forty-one students participated in the study, 48.8% (20) female and 51.2% (21) male, aged 19.63 ± 2.07 years. Social jet lag averaged 02:39 h ± 00:55 h, 82.9% (34) of participants had social jet lag ≥ 1 h, and there was a negative correlation with the Munich chronotype score, indicating greater sleep deprivation in subjects prone to eveningness. Poor sleep quality was detected in 90.2% (37) of participants (χ² = 26.56, p < 0.001) and excessive daytime sleepiness in 56.1% (23) (χ² = 0.61, p = 0.435). Significant differences were observed in LFnu, HFnu, and LF/HF between the groups with social jet lag < 2 h and ≥ 2 h, and social jet lag correlated with LFnu (rs = 0.354, p = 0.023), HFnu (rs = -0.354, p = 0.023), and LF/HF (r = 0.355, p = 0.023). There was also a negative association between IV60 and the time-domain and nonlinear indexes. It is suggested that chronic sleep deprivation may be associated with increased sympathetic activation, promoting greater cardiovascular risk.
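
For readers unfamiliar with the quantities involved, here is a minimal Python sketch of two of them: social jet lag computed as the absolute difference between mid-sleep on free days and on workdays, and the Poincaré indexes SD1 and SD2 computed from successive RR intervals. The sleep times and the RR series are synthetic; this is the standard textbook arithmetic, not the study's SPSS analysis.

```python
import numpy as np

def social_jetlag(onset_work, wake_work, onset_free, wake_free):
    """Times in decimal hours; social jet lag = |mid-sleep free - mid-sleep work|."""
    msw = (onset_work + ((wake_work - onset_work) % 24) / 2) % 24
    msf = (onset_free + ((wake_free - onset_free) % 24) / 2) % 24
    diff = abs(msf - msw)
    return min(diff, 24 - diff)

def poincare_sd1_sd2(rr_ms):
    """SD1 and SD2 (ms) from successive RR intervals via the Poincare plot axes."""
    rr = np.asarray(rr_ms, dtype=float)
    x, y = rr[:-1], rr[1:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)   # short-term variability
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)   # long-term variability
    return sd1, sd2

# Example: sleeps 23:30-06:00 on workdays and 01:00-09:00 on free days -> 2.25 h
print(social_jetlag(23.5, 6.0, 1.0, 9.0))

rr = 800 + 50 * np.random.randn(500)             # synthetic RR series (ms)
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 1), round(sd2, 1), round(sd1 / sd2, 2))
```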

Relevance: 10.00%

Abstract:

The thesis investigates whether, even with the free production of news, individuals who post information on collaborative content websites, referred to as interactors, tend to reproduce information that has been placed on the agenda by television newscasts. To examine this, the study compares the collaborative content platforms Vc repórter, Vc no G1, and Eu repórter with the newscasts SBT Brasil, Jornal Nacional, Jornal da Record, and Jornal da Band, seeking to determine whether these newscasts set the agenda of the collaborative platforms. The guiding hypothesis starts from the premise that Brazilian television newscasts have built, over time, a bond of credibility with viewers. It is therefore plausible that interactors apply the same selection criteria as the broadcasters and reproduce similar information on collaborative content sites. The method used was content analysis, based on the work of Laurence Bardin. The research adopted a quantitative approach, for which a computational system called New Crawler App was built to collect the material used in this study. It was concluded that, within the sample studied, the newscasts do set the agenda for the collaborative content.

Relevance: 10.00%

Abstract:

Some arguments are briefly presented on the negative consequences of the deep global economic and financial crisis of 2008 for economic activity and the social situation in Spain. The reformulation, sustainability, and financial viability of social welfare in Spain require new management based on resource efficiency, a greater market presence, and the initiative of stakeholders as a whole. In this sense, the main credible argument for social welfare in Spain depends on a new perspective on the socialization and generosity of the social protection system. Specifically, the solution to the crisis must come through economic growth, increased productivity, employment, and competitiveness, and not by way of increasing levels of social protection.

Relevance: 10.00%

Abstract:

Many modern applications fall into the category of "large-scale" statistical problems, in which both the number of observations n and the number of features or parameters p may be large. Many existing methods focus on point estimation, despite the continued relevance of uncertainty quantification in the sciences, where the number of parameters to estimate often exceeds the sample size even after the huge increases in n typically seen in many fields. Thus, the tendency in some areas of industry to dispense with traditional statistical analysis on the basis that "n = all" is of little relevance outside of certain narrow applications. The main result of the Big Data revolution in most fields has instead been to make computation much harder without reducing the importance of uncertainty quantification. Bayesian methods excel at uncertainty quantification, but often scale poorly relative to alternatives. This conflict between the statistical advantages of Bayesian procedures and their substantial computational disadvantages is perhaps the greatest challenge facing modern Bayesian statistics, and is the primary motivation for the work presented here.

Two general strategies for scaling Bayesian inference are considered. The first is the development of methods that lend themselves to faster computation, and the second is design and characterization of computational algorithms that scale better in n or p. In the first instance, the focus is on joint inference outside of the standard problem of multivariate continuous data that has been a major focus of previous theoretical work in this area. In the second area, we pursue strategies for improving the speed of Markov chain Monte Carlo algorithms, and characterizing their performance in large-scale settings. Throughout, the focus is on rigorous theoretical evaluation combined with empirical demonstrations of performance and concordance with the theory.

One topic we consider is modeling the joint distribution of multivariate categorical data, often summarized in a contingency table. Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. In Chapter 2, we derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions.
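
To make the latent structure representation concrete, the following Python sketch (with arbitrary, assumed dimensions) builds the joint probability tensor of p categorical variables as a PARAFAC-type mixture of k rank-1 terms, which is the kind of reduced-rank factorization referred to above.

```python
import numpy as np

rng = np.random.default_rng(0)
p, d, k = 3, 4, 2   # assumed: 3 categorical variables with 4 levels each, 2 latent classes

nu = rng.dirichlet(np.ones(k))                                  # latent class weights
psi = [rng.dirichlet(np.ones(d), size=k).T for _ in range(p)]   # psi[j][c, h] = P(y_j = c | class h)

# Joint pmf P(y1, ..., yp) = sum_h nu_h * prod_j psi[j][y_j, h]: a sum of k rank-1 tensors.
prob = np.zeros((d,) * p)
for h in range(k):
    rank1 = psi[0][:, h]
    for j in range(1, p):
        rank1 = np.multiply.outer(rank1, psi[j][:, h])
    prob += nu[h] * rank1

print(prob.shape, round(float(prob.sum()), 6))   # (4, 4, 4) and total probability 1.0
```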

Latent class models for the joint distribution of multivariate categorical data, such as the PARAFAC decomposition, play an important role in the analysis of population structure. In this context, the number of latent classes is interpreted as the number of genetically distinct subpopulations of an organism, an important factor in the analysis of evolutionary processes and conservation status. Existing methods focus on point estimates of the number of subpopulations, and lack robust uncertainty quantification. Moreover, whether the number of latent classes in these models is even an identified parameter is an open question. In Chapter 3, we show that when the model is properly specified, the correct number of subpopulations can be recovered almost surely. We then propose an alternative method for estimating the number of latent subpopulations that provides good quantification of uncertainty, and provide a simple procedure for verifying that the proposed method is consistent for the number of subpopulations. The performance of the model in estimating the number of subpopulations and other common population structure inference problems is assessed in simulations and a real data application.

In contingency table analysis, sparse data is frequently encountered for even modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization, but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis--Ylvisaker priors for the parameters of log-linear models do not give rise to closed form credible regions, complicating posterior inference. In Chapter 4 we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis--Ylvisaker priors, and provide convergence rate and finite-sample bounds for the Kullback-Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically in simulations and a real data application that the approximation is highly accurate, even in relatively small samples. The proposed approximation provides a computationally scalable and principled approach to regularized estimation and approximate Bayesian inference for log-linear models.
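
The following Python sketch illustrates the general strategy of replacing a log-linear posterior with a Gaussian, using a plain Laplace approximation for a small Poisson regression under a Gaussian prior. It is a sketch of the idea only: it is not the KL-optimal Gaussian approximation derived in Chapter 4, and it does not use Diaconis--Ylvisaker priors.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])   # assumed design matrix
beta_true = np.array([1.0, 0.5, -0.3])
y = rng.poisson(np.exp(X @ beta_true))                         # Poisson (log-linear) counts
tau = 10.0                                                     # Gaussian prior scale (assumption)

def neg_log_post(beta):
    eta = X @ beta
    return -(y @ eta - np.exp(eta).sum()) + 0.5 * beta @ beta / tau**2

mode = minimize(neg_log_post, np.zeros(3), method="BFGS").x

# Gaussian approximation N(mode, H^{-1}), with H the Hessian of the negative
# log posterior evaluated at the mode.
W = np.exp(X @ mode)
H = X.T @ (W[:, None] * X) + np.eye(3) / tau**2
cov = np.linalg.inv(H)
print("posterior mode:", np.round(mode, 2))
print("approx. posterior sd:", np.round(np.sqrt(np.diag(cov)), 2))
```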

Another challenging and somewhat non-standard joint modeling problem is inference on tail dependence in stochastic processes. In applications where extreme dependence is of interest, data are almost always time-indexed. Existing methods for inference and modeling in this setting often cluster extreme events or choose window sizes with the goal of preserving temporal information. In Chapter 5, we propose an alternative paradigm for inference on tail dependence in stochastic processes with arbitrary temporal dependence structure in the extremes, based on the idea that the information on strength of tail dependence and the temporal structure in this dependence are both encoded in waiting times between exceedances of high thresholds. We construct a class of time-indexed stochastic processes with tail dependence obtained by endowing the support points in de Haan's spectral representation of max-stable processes with velocities and lifetimes. We extend Smith's model to these max-stable velocity processes and obtain the distribution of waiting times between extreme events at multiple locations. Motivated by this result, a new definition of tail dependence is proposed that is a function of the distribution of waiting times between threshold exceedances, and an inferential framework is constructed for estimating the strength of extremal dependence and quantifying uncertainty in this paradigm. The method is applied to climatological, financial, and electrophysiology data.
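
As a simpler, classical illustration that waiting times between threshold exceedances encode extremal clustering (and not the max-stable velocity framework proposed in Chapter 5), the following Python sketch computes inter-exceedance times for a synthetic autocorrelated series and the Ferro-Segers intervals estimator of the extremal index.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.zeros(20000)                     # AR(1) series: autocorrelation makes extremes cluster
for t in range(1, len(x)):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()

u = np.quantile(x, 0.98)                # high threshold
T = np.diff(np.flatnonzero(x > u))      # waiting times between exceedances

# Ferro-Segers "intervals" estimator of the extremal index theta in (0, 1]:
# theta near 1 means little clustering, small theta means strong clustering.
if T.max() <= 2:
    theta = 2 * T.sum() ** 2 / (len(T) * (T ** 2).sum())
else:
    theta = 2 * (T - 1).sum() ** 2 / (len(T) * ((T - 1) * (T - 2)).sum())
print("estimated extremal index:", round(min(1.0, theta), 2))
```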

The remainder of this thesis focuses on posterior computation by Markov chain Monte Carlo. The Markov chain Monte Carlo method is the dominant paradigm for posterior computation in Bayesian analysis. It has long been common to control computation time by making approximations to the Markov transition kernel. Comparatively little attention has been paid to convergence and estimation error in these approximating Markov chains. In Chapter 6, we propose a framework for assessing when to use approximations in MCMC algorithms, and how much error in the transition kernel should be tolerated to obtain optimal estimation performance with respect to a specified loss function and computational budget. The results require only ergodicity of the exact kernel and control of the kernel approximation accuracy. The theoretical framework is applied to approximations based on random subsets of data, low-rank approximations of Gaussian processes, and a novel approximating Markov chain for discrete mixture models.
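
One generic example of an approximating transition kernel of the kind this framework covers is a Metropolis step whose log-likelihood is evaluated on a random subset of the data, scaled by n/m. The Python sketch below implements that idea for a toy Gaussian-mean problem; the model, subset size, and step size are assumptions, and the sketch does not implement the chapter's error bounds.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(loc=1.5, scale=1.0, size=100_000)   # large n, Gaussian with unknown mean
n, m = len(data), 1_000                               # subset size m << n

def approx_loglik(mu):
    sub = rng.choice(data, size=m, replace=False)
    return (n / m) * np.sum(-0.5 * (sub - mu) ** 2)   # scaled subset log-likelihood

mu, ll = 0.0, approx_loglik(0.0)
draws = []
for _ in range(5_000):
    prop = mu + 0.05 * rng.standard_normal()          # random-walk proposal, flat prior
    ll_prop = approx_loglik(prop)
    if np.log(rng.random()) < ll_prop - ll:           # approximate acceptance ratio
        mu, ll = prop, ll_prop
    draws.append(mu)

print("approximate posterior mean of mu:", round(float(np.mean(draws[1_000:])), 3))
```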

Data augmentation Gibbs samplers are arguably the most popular class of algorithm for approximately sampling from the posterior distribution for the parameters of generalized linear models. The truncated Normal and Polya-Gamma data augmentation samplers are standard examples for probit and logit links, respectively. Motivated by an important problem in quantitative advertising, in Chapter 7 we consider the application of these algorithms to modeling rare events. We show that when the sample size is large but the observed number of successes is small, these data augmentation samplers mix very slowly, with a spectral gap that converges to zero at a rate at least proportional to the reciprocal of the square root of the sample size up to a log factor. In simulation studies, moderate sample sizes result in high autocorrelations and small effective sample sizes. Similar empirical results are observed for related data augmentation samplers for multinomial logit and probit models. When applied to a real quantitative advertising dataset, the data augmentation samplers mix very poorly. Conversely, Hamiltonian Monte Carlo and a type of independence chain Metropolis algorithm show good mixing on the same dataset.
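
To see the phenomenon concretely, the following Python sketch runs the truncated-Normal (Albert-Chib) data augmentation sampler for an intercept-only probit model in a rare-event setting and reports the lag-1 autocorrelation of the intercept draws. The sample size, number of successes, and flat prior are illustrative assumptions, not the quantitative advertising data analyzed in Chapter 7.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(4)
n, n1 = 10_000, 20                    # rare events: 20 successes out of 10,000 trials
y = np.zeros(n)
y[:n1] = 1.0

beta, draws = 0.0, []
for _ in range(1_000):
    # z_i | beta, y_i: N(beta, 1) truncated to (0, inf) if y_i = 1, to (-inf, 0) if y_i = 0
    z = np.empty(n)
    z[:n1] = truncnorm.rvs(-beta, np.inf, loc=beta, scale=1.0, size=n1)
    z[n1:] = truncnorm.rvs(-np.inf, -beta, loc=beta, scale=1.0, size=n - n1)
    # beta | z under a flat prior on the intercept: N(mean(z), 1/n)
    beta = rng.normal(z.mean(), 1.0 / np.sqrt(n))
    draws.append(beta)

draws = np.array(draws[200:])
lag1 = np.corrcoef(draws[:-1], draws[1:])[0, 1]
print(f"posterior mean intercept: {draws.mean():.2f}, lag-1 autocorrelation: {lag1:.2f}")
```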