10 results for Concurrent enrollment

at Universidade do Minho


Relevance:

20.00%

Publisher:

Abstract:

Biometric systems are increasingly used as a means of authentication to provide system security in modern technologies. The performance of a biometric system depends on its accuracy, processing speed, template size, and the time necessary for enrollment. While much research has focused on the first three factors, enrollment time has not received as much attention. In this work, we present the findings of our research on user behavior when enrolling in a biometric system. Specifically, we collected information about users' availability for enrollment with respect to hand recognition systems (e.g., hand geometry, palm geometry, or any other system requiring positioning the hand on an optical scanner). A sample of 19 participants, chosen randomly regardless of age, gender, profession, and nationality, served as test subjects in an experiment studying the patience of users enrolling in a biometric hand recognition system.

Relevance:

20.00%

Publisher:

Abstract:

Expanding access to preschool education is a particularly important policy issue in developing countries, where enrollment rates are generally much lower, and where private institutions constitute a much larger share of the formal preschool sector, than in developed countries. This paper examines whether an expansion in the supply of public preschool crowds out private enrollment, using rich data for municipalities in Brazil from 2000 to 2006, where federal transfers to local governments change discontinuously at given population thresholds. Results from a regression-discontinuity design reveal that larger federal transfers lead to a significant expansion of local public preschool services, but show no evidence of crowding-out of private enrollment, nor of negative impacts on the quality of private providers. This finding is consistent with a theory in which households differ in willingness-to-pay for preschool services, and private suppliers optimally adjust prices in response to an expansion of lower-quality, free-of-charge public supply. In the context of the model, the absence of crowding-out effects of more public preschool providers can be rationalized by the existence of relatively large differences in willingness-to-pay for preschool services across different demand segments. Our theoretical and empirical findings therefore suggest that in developing-country settings characterized by relatively high income inequality, an expansion in public preschool supply will likely significantly increase enrollment among the poorest segments of society, and need not have adverse effects on the quantity or quality of local private supply.
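The identification strategy above compares outcomes just below and just above a population threshold at which federal transfers jump. As a minimal sketch of a sharp regression-discontinuity estimate (entirely synthetic data and hypothetical names, not the paper's actual specification or dataset):

```python
import random

def linfit(xs, ys):
    """Ordinary least squares for y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def rdd_estimate(pop, outcome, cutoff, bandwidth):
    """Fit a separate linear trend on each side of the cutoff within
    the bandwidth window; the treatment effect is the jump between
    the two fitted values at the threshold itself."""
    data = [(p - cutoff, y) for p, y in zip(pop, outcome)
            if abs(p - cutoff) <= bandwidth]
    left = [(x, y) for x, y in data if x < 0]
    right = [(x, y) for x, y in data if x >= 0]
    _, b_left = linfit(*zip(*left))    # intercept = fit at cutoff from below
    _, b_right = linfit(*zip(*right))  # intercept = fit at cutoff from above
    return b_right - b_left

# Synthetic illustration: enrollment trends smoothly in population,
# plus a 0.05 jump for municipalities above a 10,000 threshold.
random.seed(0)
pop = [random.uniform(9000, 11000) for _ in range(2000)]
enroll = [0.3 + 1e-5 * p + (0.05 if p >= 10000 else 0.0)
          + random.gauss(0, 0.01) for p in pop]
effect = rdd_estimate(pop, enroll, cutoff=10000, bandwidth=500)
```

The local-linear fit on each side absorbs the smooth population trend, so only the discontinuous jump at the threshold is attributed to the transfer change.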

Relevance:

10.00%

Publisher:

Abstract:

This paper describes the trigger and offline reconstruction, identification, and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC, at a center-of-mass energy of √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb−1. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions, and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.

Relevance:

10.00%

Publisher:

Abstract:

Tuberculosis (TB) and human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) constitute the main burden of infectious disease in resource-limited countries. In the individual host, the two pathogens, Mycobacterium tuberculosis and HIV, potentiate one another, accelerating the deterioration of immunological functions. In high-burden settings, HIV coinfection is the most important risk factor for developing active TB, as it increases susceptibility to primary infection or reinfection and also the risk of TB reactivation in patients with latent TB. M. tuberculosis infection also has a negative impact on the immune response to HIV, accelerating the progression from HIV infection to AIDS. The clinical management of HIV-associated TB includes the integration of effective anti-TB treatment, use of concurrent antiretroviral therapy (ART), prevention of HIV-related comorbidities, management of drug cytotoxicity, and prevention/treatment of immune reconstitution inflammatory syndrome (IRIS).

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation in Advanced Optometry (Optometria Avançada)

Relevance:

10.00%

Publisher:

Abstract:

Large-scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, which is necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportionally to the node churn rate. Another challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
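The conflict detection the abstract refers to reduces to a partial-order comparison of logical clocks. As a minimal sketch using plain version vectors (not the paper's Bitmapped Version Vectors or Dotted Causal Containers, which compress this metadata), two writes conflict exactly when neither clock dominates the other:

```python
def compare(vv_a, vv_b):
    """Compare two version vectors, represented as dicts mapping
    node_id -> event counter. Returns 'equal', 'a<b', 'a>b', or
    'concurrent'; 'concurrent' writes are the conflicts a per-key
    logical clock must detect."""
    nodes = set(vv_a) | set(vv_b)
    a_le_b = all(vv_a.get(n, 0) <= vv_b.get(n, 0) for n in nodes)
    b_le_a = all(vv_b.get(n, 0) <= vv_a.get(n, 0) for n in nodes)
    if a_le_b and b_le_a:
        return "equal"
    if a_le_b:
        return "a<b"
    if b_le_a:
        return "a>b"
    return "concurrent"

# Two replicas that each accepted an independent write for the same key:
verdict = compare({"n1": 2, "n2": 1}, {"n1": 1, "n2": 2})  # 'concurrent'
```

The per-key overhead the abstract criticizes is visible here: each key carries one counter per node that ever wrote it, which grows with churn; the paper's contribution is keeping this causal information while shrinking that footprint.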

Relevance:

10.00%

Publisher:

Abstract:

In our work we have chosen to integrate a formalism for knowledge representation with a formalism for process representation as a way to specify and regulate the overall activity of a multi-cellular agent. The result of this approach is XP,N, another formalism, wherein a distributed system can be modeled as a collection of interrelated sub-nets sharing a common explicit control structure. Each sub-net represents a system of asynchronous concurrent threads modeled by a set of transitions. XP,N combines local state and control with interaction and hierarchy to achieve a high-level abstraction and to model the complex relationships between all the components of a distributed system. Viewed as a tool, XP,N provides a carefully devised conflict resolution strategy that intentionally mimics the genetic regulatory mechanism used in an organic cell to select the next genes to process.
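The "asynchronous concurrent threads modeled by a set of transitions" are Petri-net-style nets. As a minimal sketch of the underlying firing rule (a plain place/transition net, not XP,N itself, with hypothetical place and transition names):

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds
    at least the required number of tokens."""
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Consume input tokens and produce output tokens: one
    asynchronous step of a concurrent thread."""
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# A single transition t moving a token from place p1 to place p2:
t = {"in": {"p1": 1}, "out": {"p2": 1}}
m0 = {"p1": 1, "p2": 0}
m1 = fire(m0, t) if enabled(m0, t) else m0
```

When several transitions are enabled under the same marking they are in conflict; XP,N's contribution is the gene-regulation-inspired strategy for choosing which one fires next.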

Relevance:

10.00%

Publisher:

Abstract:

Higher-dimensional automata (HDAs) constitute a very expressive model for concurrent systems. In this paper, we discuss "topological abstraction" of higher-dimensional automata, i.e., the replacement of HDAs by smaller ones that can be considered equivalent from the point of view of both computer science and topology. By definition, topological abstraction preserves the homotopy type, the trace category, and the homology graph of an HDA. We establish conditions under which cube collapses yield topological abstractions of HDAs.

Relevance:

10.00%

Publisher:

Abstract:

"Published online before print November 20, 2015"

Relevance:

10.00%

Publisher:

Abstract:

The Escalas de Avaliação da Interação Mãe-Bebé (Mother-Infant Interaction Rating Scales) are the Portuguese version of the Interaction Rating Scales proposed by Field (1980), and aim to assess mother-infant interaction at 3 months of age. The scales were administered to 51 mother-infant dyads at 3, 6, and 12 months postpartum. The Portuguese version showed high internal consistency (Cronbach's alpha of 0.85 for the IRSff infant scale, 0.91 for IRSff mother, 0.87 for IRSal infant, and 0.82 for IRSal mother), as well as high reliability and concurrent and predictive validity. The Escalas de Avaliação da Interação Mãe-Bebé thus constitute a robust instrument for assessing mother-infant interaction, both in the face-to-face interaction situation and in the feeding interaction situation, and can be used across different samples and contexts, clinical and research alike.
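The internal-consistency figures reported above are Cronbach's alpha coefficients. As a minimal sketch of how that statistic is computed from item-level scores (illustrative data only, not the study's dataset):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of items. Uses sample (n-1) variance."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all items:
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three hypothetical items scored by four respondents:
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 3, 5], [1, 3, 3, 4]])
```

Values near 1 indicate that the items covary strongly, i.e., the scale is measuring a single coherent construct, which is what the 0.82–0.91 range reported for the Portuguese version supports.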