Abstract:
Humans are particularly adept at modifying their behavior in accordance with changing environmental demands. Through various mechanisms of cognitive control, individuals are able to tailor actions to fit complex short- and long-term goals. The research described in this thesis uses functional magnetic resonance imaging to characterize the neural correlates of cognitive control at two levels of complexity: response inhibition and self-control in intertemporal choice. First, we examined changes in neural response associated with increased experience and skill in response inhibition; successful response inhibition was associated with decreased neural response over time in the right ventrolateral prefrontal cortex, a region widely implicated in cognitive control, providing evidence for increased neural efficiency with learned automaticity. We also examined a more abstract form of cognitive control using intertemporal choice. In two experiments, we identified putative neural substrates for individual differences in temporal discounting, or the tendency to prefer immediate to delayed rewards. Using dynamic causal models, we characterized the neural circuit between ventromedial prefrontal cortex, an area involved in valuation, and dorsolateral prefrontal cortex, a region implicated in self-control in intertemporal and dietary choice, and found that connectivity from dorsolateral prefrontal cortex to ventromedial prefrontal cortex increases at the time of choice, particularly when delayed rewards are chosen. Moreover, estimates of the strength of connectivity predicted out-of-sample individual rates of temporal discounting, suggesting a neurocomputational mechanism for variation in the ability to delay gratification. Next, we interrogated the hypothesis that individual differences in temporal discounting are in part explained by the ability to imagine future reward outcomes. 
Using a novel paradigm, we imaged neural response during the imagining of primary rewards, and identified negative correlations between activity in regions associated with the processing of both real and imagined rewards (lateral orbitofrontal cortex and ventromedial prefrontal cortex, respectively) and the individual temporal discounting parameters estimated in the previous experiment. These data suggest that individuals who are better able to represent reward outcomes neurally are less susceptible to temporal discounting. Together, these findings provide further insight into the role of the prefrontal cortex in implementing cognitive control, and propose neurobiological substrates for individual variation.
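Temporal discounting of the kind estimated here is commonly modeled with a hyperbolic discount function, V = A / (1 + kD), where larger k means steeper discounting. A minimal sketch under that standard assumption (the thesis's exact model and fitted parameters are not shown here):

```python
def hyperbolic_value(amount, delay, k):
    """Subjective value of a delayed reward under hyperbolic discounting."""
    return amount / (1.0 + k * delay)

def choose_delayed(immediate, delayed, delay, k):
    """True if the delayed option has the higher subjective value."""
    return hyperbolic_value(delayed, delay, k) > immediate

# A steep discounter (large k) rejects a delayed reward that a
# shallow discounter accepts.
print(choose_delayed(20, 40, 30, k=0.01))  # 40/(1+0.3) ~ 30.8 > 20 -> True
print(choose_delayed(20, 40, 30, k=0.10))  # 40/(1+3.0) = 10.0 < 20 -> False
```

Fitting k per subject to a set of such binary choices yields the individual discounting parameter that the imaging results are correlated against.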
Abstract:
Curve samplers are sampling algorithms that proceed by viewing the domain as a vector space over a finite field, and randomly picking a low-degree curve in it as the sample. Curve samplers exhibit a nice property besides the sampling property: the restriction of low-degree polynomials over the domain to the sampled curve is still low-degree. This property is often used in combination with the sampling property and has found many applications, including PCP constructions, local decoding of codes, and algebraic PRG constructions.
The randomness complexity of curve samplers is a crucial parameter for their applications. It is known that (non-explicit) curve samplers using O(log N + log(1/δ)) random bits exist, where N is the domain size and δ is the confidence error. The question of explicitly constructing randomness-efficient curve samplers was first raised in [TU06], where curve samplers with near-optimal randomness complexity were obtained.
In this thesis, we present an explicit construction of low-degree curve samplers with optimal randomness complexity (up to a constant factor) that sample curves of degree (m log_q(1/δ))^O(1) in F_q^m. Our construction is a delicate combination of several components, including extractor machinery, limited independence, iterated sampling, and list-recoverable codes.
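To make the object concrete, here is a minimal sketch of a naive, fully random degree-d curve over a prime field F_p (one polynomial per coordinate). This is *not* the thesis's randomness-efficient construction, which uses far fewer random bits; it only illustrates what a curve sample is:

```python
import random

P = 101  # a small prime; F_p stands in for the finite field F_q

def random_curve(m, d, rng=random):
    """A random degree-d curve c : F_p -> F_p^m, i.e. one random
    degree-d polynomial per coordinate."""
    coeffs = [[rng.randrange(P) for _ in range(d + 1)] for _ in range(m)]
    def curve(t):
        return tuple(sum(c * pow(t, i, P) for i, c in enumerate(row)) % P
                     for row in coeffs)
    return curve

# Sampling: evaluating the curve at every t in F_p yields |F_p|
# correlated sample points in the domain F_p^m.
c = random_curve(m=3, d=2)
samples = [c(t) for t in range(P)]
print(len(samples), len(samples[0]))  # -> 101 3
```

The key structural property mentioned above follows directly: restricting a degree-k polynomial on F_p^m to this curve gives a univariate polynomial of degree at most k·d, which is still low-degree.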
Abstract:
This dissertation addresses access to high-complexity services, particularly diagnostic and complementary examinations, studied among users of private health plans who seek specialized care and diagnosis. Since the 1980s, users of the public health system have been turning to supplementary (private) health care. However, whether access is actually guaranteed in the private domain, through the purchase of health plans, is the uncertainty that inspired this research, which is justified by the relevance of actions that can improve the regulatory quality of health plans through the social control exercised by their users. The general objective is to analyze perceptions of access to high-complexity examinations in private health services among health plan users. The specific objectives are to describe health plan users' perceptions of access to high-complexity examinations; to analyze the motivations of private health plan users for undergoing high-complexity examinations through the private care network; and to analyze health plan users' level of satisfaction with access to high-complexity examinations. The methodology is qualitative and descriptive; the sample comprised thirty health plan users over 18 years of age, selected in the study setting in 2010. The study setting was a private diagnostic medicine laboratory in Rio de Janeiro. Data were collected using a form and structured individual interviews. The form was analyzed with descriptive statistics, and the interviews through thematic-categorical content analysis. Health plan users reported that access to high-complexity examinations is easily guaranteed.
Their main motivations for undergoing these examinations in the private care network were the speed of service; the flexibility and ease of scheduling online, by telephone, or in person at the laboratory studied; prompt delivery of results; the difficulty and slowness of care in the SUS (Brazil's public health system); the location of the accredited provider near residential neighborhoods or workplaces; excellent diagnostic imaging; and the possibility of choosing between open and closed magnetic resonance imaging and computed tomography, in addition to bone densitometry, all of which were easily accessible to all research subjects. Satisfaction was associated with the speed of examinations, with elective and urgent cases rated by users as nearly equivalent in waiting time. However, although users rated their health plans highly, some difficulties were reported, such as: expiration dates on previously dated medical orders; authorization codes required by the plan operator; bureaucratic scheduling procedures; difficulty accessing treatments such as implants, physical therapy, RPG, Pilates, home care, and check-up consultations; denial of reimbursements; restrictions on surgical materials, especially prostheses and orthoses; and specific prescription-strength restrictions for myopia surgery. It is concluded that the rapid performance of high-cost imaging examinations was described as satisfactory in this sample, although the perception of speed may vary with the type of private health plan product purchased, and regulatory improvement is needed in some specific aspects of supplementary health care.
Abstract:
A new method of frequency-shifting a diode laser is realized. Using a sample-and-hold circuit, the error signal is held during the frequency shift. This avoids the pulling, or even loss of lock, caused by the servo circuit when a voltage step is applied to the piezoelectric transducer (PZT) to shift the laser frequency.
Abstract:
Complexity in the earthquake rupture process can result from many factors. This study investigates the origin of such complexity by examining several recent, large earthquakes in detail. In each case the local tectonic environment plays an important role in understanding the source of the complexity.
Several large shallow earthquakes (Ms > 7.0) along the Middle American Trench have similarities and differences between them that may lead to a better understanding of fracture and subduction processes. They are predominantly thrust events consistent with the known subduction of the Cocos plate beneath N. America. Two events occurring along this subduction zone close to triple junctions show considerable complexity. This may be attributable to a more heterogeneous stress environment in these regions and as such has implications for other subduction zone boundaries.
An event which looks complex but is actually rather simple is the 1978 Bermuda earthquake (Ms ~ 6). It is located predominantly in the mantle. Its mechanism is one of pure thrust faulting with a strike N 20°W and dip 42°NE. Its apparent complexity is caused by local crustal structure. This is an important event in terms of understanding and estimating seismic hazard on the eastern seaboard of N. America.
A study of several large strike-slip continental earthquakes identifies characteristics which are common to them and may be useful in determining what to expect from the next great earthquake on the San Andreas fault. The events are the 1976 Guatemala earthquake on the Motagua fault and two events on the Anatolian fault in Turkey (the 1967 Mudurnu Valley and 1976 E. Turkey events). An attempt to model the complex P-waveforms of these events results in good synthetic fits for the Guatemala and Mudurnu Valley events. However, the E. Turkey event proves to be too complex, as it may have associated thrust or normal faulting. Several individual sources occurring at intervals of between 5 and 20 seconds characterize the Guatemala and Mudurnu Valley events. The maximum size of an individual source appears to be bounded at about 5 × 10^26 dyne-cm. A detailed source study including directivity is performed on the Guatemala event. The source time history of the Mudurnu Valley event illustrates its significance in modeling strong ground motion in the near field. The complex source time series of the 1967 event produces amplitudes greater by a factor of 2.5 than a uniform model scaled to the same size for a station 20 km from the fault.
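For scale, the ~5 × 10^26 dyne-cm bound on an individual sub-event can be converted to moment magnitude with the standard Hanks-Kanamori relation (an illustrative calculation, not part of the original study's text):

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Hanks-Kanamori moment magnitude from seismic moment in dyne-cm:
    Mw = (2/3) * log10(M0) - 10.7"""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# The ~5 x 10^26 dyne-cm bound corresponds to roughly Mw 7.1 per sub-event.
print(round(moment_magnitude(5e26), 1))  # -> 7.1
```

This is consistent with the picture of a great San Andreas event as a sequence of several such sub-events rather than a single, much larger rupture.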
Three large and important earthquakes demonstrate an important type of complexity --- multiple-fault complexity. The first, the 1976 Philippine earthquake, an oblique thrust event, represents the first seismological evidence for a northeast dipping subduction zone beneath the island of Mindanao. A large event, following the mainshock by 12 hours, occurred outside the aftershock area and apparently resulted from motion on a subsidiary fault since the event had a strike-slip mechanism.
An aftershock of the great 1960 Chilean earthquake on June 6, 1960, proved to be an interesting discovery. It appears to be a large strike-slip event at the main rupture's southern boundary. It most likely occurred on the landward extension of the Chile Rise transform fault, in the subducting plate. The results for this event suggest that a small event triggered a series of slow events, the duration of the whole sequence being longer than 1 hour. This is indeed a "slow earthquake".
Perhaps one of the most complex of events is the recent Tangshan, China event. It began as a large strike-slip event. Within several seconds of the mainshock it may have triggered thrust faulting to the south of the epicenter. There is no doubt, however, that it triggered a large oblique normal event to the northeast, 15 hours after the mainshock. This event certainly contributed to the great loss of life sustained as a result of the Tangshan earthquake sequence.
What has been learned from these studies has been applied to predict what one might expect from the next great earthquake on the San Andreas. The expectation from this study is that such an event would be a large complex event, not unlike, but perhaps larger than, the Guatemala or Mudurnu Valley events. That is to say, it will most likely consist of a series of individual events in sequence. It is also quite possible that the event could trigger associated faulting on neighboring fault systems such as those occurring in the Transverse Ranges. This has important bearing on the earthquake hazard estimation for the region.
Abstract:
In the first part of the thesis we explore three fundamental questions that arise naturally when we conceive a machine learning scenario where the training and test distributions can differ. Contrary to conventional wisdom, we show that mismatched training and test distributions can in fact yield better out-of-sample performance. This optimal performance can be obtained by training with the dual distribution. This optimal training distribution depends on the test distribution set by the problem, but not on the target function that we want to learn. We show how to obtain this distribution in both discrete and continuous input spaces, as well as how to approximate it in a practical scenario. The benefits of using this distribution are exemplified in both synthetic and real data sets.
In order to apply the dual distribution in the supervised learning scenario where the training data set is fixed, it is necessary to use weights to make the sample appear as if it came from the dual distribution. We explore the negative effect that weighting a sample can have. The theoretical decomposition of the effect of weights on the out-of-sample error is easy to understand but not actionable in practice, as the quantities involved cannot be computed. Hence, we propose the Targeted Weighting algorithm, which determines, for a given set of weights, whether out-of-sample performance will improve in a practical setting. This is necessary because the setting assumes there are no labeled points distributed according to the test distribution, only unlabeled samples.
Finally, we propose a new class of matching algorithms that can be used to match the training set to a desired distribution, such as the dual distribution (or the test distribution). These algorithms can be applied to very large datasets, and we show how they lead to improved performance in a large real dataset such as the Netflix dataset. Their computational complexity is the main reason for their advantage over previous algorithms proposed in the covariate shift literature.
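The reweighting idea underlying this covariate-shift setting can be sketched with classic importance weights w(x) = p_test(x)/p_train(x) for a simple Gaussian shift. This is an illustrative sketch, not the thesis's Targeted Weighting or matching algorithms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariate shift: training and test inputs come from different Gaussians.
x_train = rng.normal(0.0, 1.0, 5000)       # training distribution N(0, 1)
test_mean, test_std = 1.0, 1.0             # test distribution N(1, 1)

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Importance weights w(x) = p_test(x) / p_train(x) make the fixed training
# sample behave, on average, as if drawn from the test distribution.
w = gaussian_pdf(x_train, test_mean, test_std) / gaussian_pdf(x_train, 0.0, 1.0)

# Sanity check: the weighted training mean approaches the test mean of 1.0.
print(np.average(x_train, weights=w))
```

As the abstract notes, such weighting also has a cost: heavy weights inflate the variance of the effective sample, which is exactly the trade-off the Targeted Weighting algorithm is designed to assess.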
In the second part of the thesis we apply machine learning to the problem of behavior recognition. We develop a specific behavior classifier to study fly aggression, and we develop a system that allows behavior in videos of animals to be analyzed with minimal supervision. The system, which we call CUBA (Caltech Unsupervised Behavior Analysis), detects movemes, actions, and stories from time series describing the positions of animals in videos. The method summarizes the data and provides biologists with a mathematical tool to test new hypotheses. Other benefits of CUBA include finding classifiers for specific behaviors without the need for annotation, as well as providing means to discriminate groups of animals, for example, according to their genetic line.
Abstract:
The nuclear resonant reaction 19F(p,αγ)16O has been used to perform depth-sensitive analyses of fluorine in lunar samples and carbonaceous chondrites. The resonance at 0.83 MeV (center-of-mass) in this reaction is utilized to study fluorine surface films, with particular interest paid to the outer micron of Apollo 15 green glass, Apollo 17 orange glass, and lunar vesicular basalts. These results are distinguished from terrestrial contamination, and are discussed in terms of a volcanic origin for the samples of interest. Measurements of fluorine in carbonaceous chondrites are used to better define the solar system fluorine abundance. A technique for measurement of carbon on solid surfaces with applications to direct quantitative analysis of implanted solar wind carbon in lunar samples is described.
Abstract:
The first bilateral study of methods of biological sampling and biological methods of water quality assessment took place during June 1977 on selected sampling sites in the catchment of the River Trent (UK). The study was arranged in accordance with the protocol established by the joint working group responsible for the Anglo-Soviet Environmental Agreement. The main purpose of the bilateral study in Nottingham was for some of the methods of sampling and biological assessment used by UK biologists to be demonstrated to their Soviet counterparts and for the Soviet biologists to have the opportunity to test these methods at first hand in order to judge the potential of any of these methods for use within the Soviet Union. This paper is concerned with the nine river stations in the Trent catchment.
Abstract:
In recent collaborative biological sampling exercises organised by the Nottingham Regional Laboratory of the Severn-Trent Water Authority, the effect of hand-net sampling variation on the quality and usefulness of the data obtained has been questioned, especially when these data are transcribed into one or more of the commonly used biological methods of water quality assessment. This study investigates whether this effect is constant at sites with similar topography but differing water quality when the sampling method is standardized and carried out by a single operator. An argument is made for the use of a lowest-common-denominator approach to give a more consistent result and obviate the effect of sampling variation on these biological assessment methods.
Abstract:
This dissertation estimates the prevalence of depressive symptoms in elderly people at three levels of health care complexity and studies the co-occurrence of depressive symptoms and functional disability. In Brazil, the demographic transition occurred rapidly and explosively. As the number of elderly people grows, the prevalence of chronic diseases and their complications increases. Functional ability can be viewed as a summary measure of the overall impact of medical conditions in the context of an individual's environment and social support system, and should be an important consideration in any health planning. One illness associated with a high degree of functional disability is depression. Among mental health conditions, depression is one of the most common and important psychiatric problems in elderly individuals. This is a cross-sectional study with a sample of 643 randomly selected individuals aged 65 or over, users of three public health services of increasing complexity (primary, secondary, and tertiary). The prevalence of depressive symptoms was estimated with the 15-item Geriatric Depression Scale (GDS-15), previously translated and validated for use in Brazil. Functional status was defined by scores on the SF-36 and HAQ instruments. The prevalence of depressive symptoms in the total sample was 45.2% (CI 41.1-49.3). Stratified by unit, prevalence was 35.3% at the primary level, 47.6% at the secondary level, and 51.7% at the tertiary level (p=0.004). Prevalence was high at all three levels of care complexity, including among the elderly at the basic health unit, even though these were more independent and healthier individuals. Overall prevalence of depressive symptoms increased as the degree of functional disability increased. Active screening for elderly people with depressive symptoms is important at every level of complexity in the health care system.
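The reported overall interval can be roughly reproduced with a normal-approximation confidence interval for a proportion. This is a sketch only: the abstract does not state which interval method was used, and the published interval (41.1-49.3) is slightly wider than the simple approximation, possibly reflecting the survey design:

```python
import math

def prevalence_ci(p_hat, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Reported: 45.2% depressive symptoms among 643 elderly participants.
lo, hi = prevalence_ci(0.452, 643)
print(round(100 * lo, 1), round(100 * hi, 1))  # -> 41.4 49.0
```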
Abstract:
The development of society and the economy requires adaptations in public administration to meet the public's expectations: better public services and the effective, high-quality delivery of the products of its actions to the citizen-client, thereby fulfilling the principle of efficiency. As the quantity and complexity of the operations carried out by public administration grow, so do its risks. The actions of the control system must therefore be intensified, and excellent results can be obtained through technical cooperation between external and internal control bodies. Accordingly, this study aimed to assess, among civil servants of the municipality of Rio de Janeiro working at the Tribunal de Contas do Município do Rio de Janeiro (TCMRJ) and the Controladoria Geral do Município (CGM), whether synergy between the external and internal control bodies contributes to improving control in municipal public administration. To meet this objective, a descriptive, quantitative, deductive, and applied study was carried out. Data were obtained through a survey questionnaire made available on the intranet of the Secretaria Geral de Controle Externo of the TCMRJ and of the CGM. The sample was drawn from the universe of civil servants with secondary or higher education, yielding 138 survey respondents. As to the guiding question of the research, 95% of respondents agree that technical cooperation between the TCMRJ and the CGM is important for improving control in municipal public administration.