880 results for Homography constraint
Abstract:
AIRES, Kelson R. T.; ARAÚJO, Hélder J.; MEDEIROS, Adelardo A. D. Plane Detection Using Affine Homography. In: CONGRESSO BRASILEIRO DE AUTOMÁTICA, 2008, Juiz de Fora, MG: Anais... do CBA 2008.
Abstract:
This article reports the results of a survey of the pearl oyster industry in the territory of French Polynesia. Its purpose is to examine perceptions of the priorities for steering this industry towards sustainable development. These perceptions were captured through a survey of pearl oyster farmers and of other stakeholders in the sector (management authorities, scientists). After describing the methodological protocol of these investigations, we first compare the sustainable-development priorities chosen by the professionals (i.e. pearl farmers) with the perceptions of the other stakeholders in the sector. Secondly, we build a typology of the pearl farmers' priorities concerning sustainable development. This analysis enables an assessment of the degree of convergence within the sector, which is the base material for defining a shared action plan at the territory scale. This is the first study compiling survey data from the various professionals and stakeholders of the pearl farming industry over such a large area of French Polynesia.
Abstract:
Context. In February-March 2014, the MAGIC telescopes observed the high-frequency peaked BL Lac 1ES 1011+496 (z=0.212) in a flaring state at very high energies (VHE, E>100 GeV). The flux reached a level more than 10 times higher than any previously recorded flaring state of the source. Aims. We describe the characteristics of the flare, presenting the light curve and the spectral parameters of the night-wise spectra and of the average spectrum of the whole period. From these data we aim to detect the imprint of the Extragalactic Background Light (EBL) in the VHE spectrum of the source, in order to constrain its intensity in the optical band. Methods. We analyzed the gamma-ray data from the MAGIC telescopes using the standard MAGIC software to produce the light curve and the spectra. To constrain the EBL we implement the method developed by the H.E.S.S. collaboration, in which the intrinsic energy spectrum of the source is modeled with a simple function (< 4 parameters) and the EBL-induced optical depth is calculated using a template EBL model. The likelihood of the observed spectrum is then maximized, including a normalization factor for the EBL opacity among the free parameters. Results. The collected data allowed us to describe the flux changes night by night and to produce differential energy spectra for all nights of the observed period. The estimated intrinsic spectra of all the nights could be fitted by power-law functions. Evaluating the changes in the fit parameters, we conclude that the spectral shape for most of the nights was compatible, regardless of the flux level, which enabled us to produce an average spectrum from which the EBL imprint could be constrained. The likelihood ratio test shows that a model with an EBL density of 1.07 (-0.20, +0.24)_stat+sys, relative to the one in the tested EBL template (Domínguez et al. 2011), is preferred at the 4.6 σ level over the no-EBL hypothesis, under the assumption that the intrinsic source spectrum can be modeled as a log-parabola. This translates into a constraint on the EBL density in the wavelength range [0.24 μm, 4.25 μm], with a peak value at 1.4 μm of λF_λ = 12.27^(+2.75)_(-2.29) nW m^(-2) sr^(-1), including systematics.
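The EBL-scaling procedure summarized above reduces, in essence, to a one-parameter likelihood-ratio test. The Python fragment below is only a minimal sketch with assumed placeholder spectral points and a placeholder template optical depth; it is not the MAGIC analysis chain, and the log-parabola parameters, binning and errors are all illustrative.

```python
# Minimal sketch (not the MAGIC analysis code) of the EBL-scaling likelihood fit:
# the intrinsic spectrum is modeled as a log-parabola, the template optical depth
# tau(E) is scaled by a free normalization alpha, and the no-EBL hypothesis
# (alpha = 0) is compared via a likelihood ratio. All data values are placeholders.
import numpy as np
from scipy.optimize import minimize

E = np.array([0.1, 0.2, 0.4, 0.8, 1.6])                          # TeV, hypothetical bins
flux = np.array([2.0e-10, 6.0e-11, 1.4e-11, 2.0e-12, 1.5e-13])   # observed dN/dE (toy)
flux_err = 0.15 * flux                                            # assumed 15% errors
tau_template = np.array([0.1, 0.3, 0.7, 1.5, 3.0])               # from an EBL template

def model(params, E, tau):
    f0, a, b, alpha = params
    intrinsic = f0 * (E / 0.3) ** (-a - b * np.log10(E / 0.3))   # log-parabola
    return intrinsic * np.exp(-alpha * tau)                       # EBL absorption

def neg2logL(params):
    resid = (flux - model(params, E, tau_template)) / flux_err
    return np.sum(resid ** 2)                                     # Gaussian likelihood

free = minimize(neg2logL, x0=[1e-10, 2.0, 0.1, 1.0], method="Nelder-Mead")
null = minimize(lambda p: neg2logL(np.append(p, 0.0)),            # alpha fixed to 0
                x0=[1e-10, 2.0, 0.1], method="Nelder-Mead")

ts = null.fun - free.fun                                          # likelihood-ratio statistic
print("best-fit EBL scaling alpha =", free.x[3])
print("significance vs no-EBL ~", np.sqrt(ts), "sigma (1 d.o.f., Wilks)")
```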
Abstract:
Undergraduate project presented to Universidade Fernando Pessoa as part of the requirements for the degree of Licenciada (Licentiate) in Physiotherapy.
Abstract:
Declarative techniques such as Constraint Programming can be very effective in modeling and assisting management decisions. We present a method for managing university classrooms which extends the previous design of a Constraint-Informed Information System to generate the timetables while dealing with spatial resource optimization issues. We seek to maximize space utilization along two dimensions: classroom use and occupancy rates. While we want to maximize the room use rate, we still need to satisfy the soft constraints which model students' and lecturers' preferences. We present a constraint logic programming-based local search method which relies on an evaluation function that combines room utilization and timetable soft preferences. Based on this, we developed a tool which we applied to the improvement of classroom allocation in a university. Compared with the current timetables, obtained without optimizing space utilization, the initial version of our tool reaches a 30% improvement in space utilization while preserving the quality of the timetable, both for students and lecturers.
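As an illustration of the kind of evaluation function described above, the sketch below combines the two utilization dimensions (room use and occupancy rates) with a soft-preference penalty; the weights, data structure and penalty values are assumptions, not the tool's actual implementation.

```python
# Illustrative sketch (not the actual tool) of an evaluation function combining
# room utilization with soft-preference penalties, as used to guide a local search.
from dataclasses import dataclass

@dataclass
class Assignment:
    room_capacity: int      # seats in the assigned room
    class_size: int         # students enrolled in the lecture
    hours: int              # weekly hours of the lecture in this room
    soft_penalty: float     # 0 if all student/lecturer preferences are met

def evaluate(assignments, room_hours_available, w_util=1.0, w_pref=0.5):
    """Higher is better: reward use and occupancy rates, penalize broken preferences."""
    used_hours = sum(a.hours for a in assignments)
    use_rate = used_hours / room_hours_available                  # classroom use rate
    occupancy = sum(a.class_size / a.room_capacity * a.hours
                    for a in assignments) / max(used_hours, 1)    # occupancy rate
    penalty = sum(a.soft_penalty for a in assignments)
    return w_util * (use_rate + occupancy) - w_pref * penalty

# Example: two lectures placed in rooms, 40 available room-hours in total
timetable = [Assignment(60, 45, 4, 0.0), Assignment(30, 28, 3, 1.0)]
print(evaluate(timetable, room_hours_available=40))
```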
Abstract:
Solving a complex Constraint Satisfaction Problem (CSP) is a computationally hard task which may require a considerable amount of time. Parallelism has been applied successfully to this task, and there are already many applications capable of harnessing the parallel power of modern CPUs to speed up the solving process. Current Graphics Processing Units (GPUs), containing from a few hundred to a few thousand cores, possess a level of parallelism that surpasses that of CPUs, yet far fewer applications are capable of solving CSPs on GPUs, leaving space for further improvement. This paper describes work in progress on solving CSPs in parallel on GPUs, CPUs and other devices, such as Intel Many Integrated Cores (MICs). It presents the gains obtained when applying more devices to solve some problems and the main challenges that must be faced when using devices with architectures as different as CPUs and GPUs, with a greater focus on how to effectively achieve good load balancing between such heterogeneous devices.
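One common way to attack the load-balancing issue mentioned above is to split the search space into many sub-search-spaces and hand each device a share proportional to its measured throughput; the sketch below only illustrates that general idea and is not taken from the paper.

```python
# Schematic illustration (assumed, not the paper's implementation) of throughput-
# proportional work sharing: faster devices (e.g. GPUs) receive more sub-search-spaces.
def split_shares(n_subspaces, throughputs):
    """throughputs: dict device -> sub-search-spaces solved per second so far."""
    total = sum(throughputs.values())
    shares = {dev: round(n_subspaces * rate / total)
              for dev, rate in throughputs.items()}
    # hand any rounding leftover to the fastest device
    leftover = n_subspaces - sum(shares.values())
    fastest = max(throughputs, key=throughputs.get)
    shares[fastest] += leftover
    return shares

# Example: a CPU, a GPU and a MIC with made-up measured relative speeds
print(split_shares(1000, {"cpu": 120.0, "gpu": 540.0, "mic": 240.0}))
# -> {'cpu': 133, 'gpu': 600, 'mic': 267}
```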
Abstract:
A High-Performance Computing (HPC) job dispatcher is a critical software component that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. The fact that the problem is on-line means that solutions must be computed in real time, and the time required to compute them cannot exceed some threshold without affecting normal system functioning. In addition, a job dispatcher must deal with a lot of uncertainty: submission times, the number of requested resources, and the duration of jobs. Heuristic-based techniques have been broadly used in HPC systems, obtaining (sub-)optimal solutions in a short time. However, their scheduling and resource allocation components are separated, which produces decoupled decisions that may cause a performance loss. Optimization-based techniques are less used for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are being used for modern applications, such as big data analytics and predictive model building, that in general employ many short jobs. However, this information is unknown at dispatching time, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to tackle job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions in a brief period and integrating current and past information about the hosting system. For these reasons, we propose CP-based dispatchers that are more suitable for HPC systems running modern applications, generate on-line dispatching decisions within an appropriate time, and make effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
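To make the CP formulation concrete, the sketch below models a toy instance of the dispatching problem with Google OR-Tools CP-SAT: a cumulative constraint keeps the requested cores within the system capacity, release times bound the start variables, and the sum of completion times is minimized under a hard time limit, as on-line use requires. The solver choice, job data and objective are illustrative assumptions, not the dispatchers proposed in the thesis.

```python
# Minimal CP sketch of on-line job dispatching, for illustration only.
from ortools.sat.python import cp_model

jobs = [  # (release time, duration estimate, requested cores) -- hypothetical data
    (0, 5, 16), (0, 2, 8), (1, 1, 32), (3, 4, 8),
]
CAPACITY = 48          # total cores of the (toy) system
HORIZON = 50

model = cp_model.CpModel()
starts, ends, intervals, demands = [], [], [], []
for i, (release, dur, cores) in enumerate(jobs):
    s = model.NewIntVar(release, HORIZON, f"start_{i}")
    e = model.NewIntVar(release, HORIZON, f"end_{i}")
    intervals.append(model.NewIntervalVar(s, dur, e, f"job_{i}"))
    demands.append(cores)
    starts.append(s)
    ends.append(e)

model.AddCumulative(intervals, demands, CAPACITY)     # never exceed the core pool
model.Minimize(sum(ends))                             # proxy for low waiting times

solver = cp_model.CpSolver()
solver.parameters.max_time_in_seconds = 1.0           # on-line: bound solving time
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for i, s in enumerate(starts):
        print(f"job {i} starts at t={solver.Value(s)}")
```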
Abstract:
Background: Recent scientific literature has shown that proper postural control facilitates upper-limb movements. There is evidence that applying trunk restraints to the patient improves upper-limb function. Objectives: The main objective of this thesis was to verify how supporting the trunk with a stable axial structure, through an external support referred to as a "trunk constraint", increases postural control in order to facilitate fractionated upper-limb movements in people with sequelae of neurological conditions. Materials and methods: The clinical case concerns a 60-year-old man with left hemiparesis resulting from a right ischemic stroke. A protocol of ten one-hour treatment sessions was carried out, in which facilitation through the trunk constraint was applied in different rehabilitation settings. Data were collected with the Trunk Control Test, the Trunk Impairment Scale and the Fugl-Meyer Assessment. In addition, an observational video analysis of a functional upper-limb gesture was performed. Results: The collected data show positive effects with respect to the initial hypotheses. Improvements were found in the items of the administered scales and in the qualitative assessment of the upper limb. In particular, trunk control improved on the Trunk Control Test and the Trunk Impairment Scale, and upper-limb function improved on the Fugl-Meyer Assessment. The observational video analysis shows an improvement in activation timing during the reaching phase. Conclusions: The results obtained support the idea that increasing the antigravity activity of the trunk, including through external supports such as the trunk constraint, can facilitate functional improvement of the upper limb.
Abstract:
Combinatorial decision and optimization problems arise in numerous applications, such as logistics and scheduling, and can be solved with various approaches. Boolean Satisfiability and Constraint Programming solvers are among the most used, and their performance is significantly influenced by the model chosen to represent a given problem. This has led to the study of model reformulation methods, one of which is tabulation, which consists in rewriting the expression of a constraint in terms of a table constraint. To apply it, one should identify which constraints can help and which can hinder the solving process. So far this has been performed by hand, for example in MiniZinc, or automatically with manually designed heuristics, as in Savile Row. However, it has been shown that the performance of these heuristics differs across problems and solvers, in some cases helping and in others hindering the solving procedure. Recent works in the field of combinatorial optimization have shown that Machine Learning (ML) can be increasingly useful in the model reformulation steps. This thesis aims to design an ML approach to identify the instances for which Savile Row's heuristics should be activated. Additionally, since the heuristics may miss some good tabulation opportunities, we perform an exploratory analysis for the creation of an ML classifier able to predict whether or not a constraint should be tabulated. The results towards the first goal show that a random forest classifier leads to an increase in the performance of 4 different solvers. The experimental results for the second task show that an ML approach could improve the performance of a solver for some problem classes.
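A minimal sketch of the first task, under assumed instance features and toy labels (not the thesis' feature set or data), could look as follows: a random forest is trained to predict whether activating Savile Row's tabulation heuristics pays off for a given instance.

```python
# Exploratory sketch, not the thesis' pipeline: feature names, toy data and the
# "heuristics paid off" labels are all illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# columns: n_variables, n_constraints, mean_domain_size, share_of_global_constraints
X = np.array([
    [120, 300, 8.0, 0.10], [5000, 9000, 3.5, 0.40], [800, 1500, 20.0, 0.05],
    [60, 90, 4.0, 0.50], [2500, 7000, 12.0, 0.25], [300, 450, 6.0, 0.30],
])
y = np.array([0, 1, 0, 1, 1, 0])   # 1 = tabulation heuristics paid off (toy labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("activate heuristics?", clf.predict(X_test))
print("held-out accuracy:", clf.score(X_test, y_test))
```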
Abstract:
The aim of this paper is to discuss some rhythmic differences between European Portuguese (EP) and Brazilian Portuguese (BP) and their relationship to pretonic vowel reduction phenomena. After the basic facts of EP and BP are presented, we show that the issue cannot be discussed without taking secondary stress placement into account, and we proceed to present the algorithm-based approach to secondary stress in Portuguese, representative of Metrical Phonology analyses. After showing that this deterministic approach cannot adequately explain the variable position of secondary stress in both languages in words with an even number of pretonic syllables, we argue for the interpretation of secondary stress, and therefore for the construction of rhythmic units, at the PF interface, as suggested in Chomsky's Minimalist Program. We also propose, inspired by the constraint hierarchies of Optimality Theory, that such an interpretation must take into account two different constraint rankings in EP and BP. These different rankings would ultimately explain the rhythmic differences between the two languages, as well as the different behavior of pretonic vowels with respect to reduction processes.
Abstract:
In this Letter, we propose a new and model-independent cosmological test for the distance-duality (DD) relation, η = D_L(z)(1 + z)^(-2)/D_A(z) = 1, where D_L and D_A are, respectively, the luminosity and angular diameter distances. For D_L we consider two sub-samples of Type Ia supernovae (SNe Ia) taken from the Constitution data, whereas the D_A distances are provided by two samples of galaxy clusters compiled by De Filippis et al. and Bonamente et al. by combining Sunyaev-Zeldovich effect and X-ray surface brightness measurements. The SNe Ia redshifts of each sub-sample were carefully chosen to coincide with those of the associated galaxy cluster sample (Δz < 0.005), thereby allowing a direct test of the DD relation. Since for very low redshifts D_A(z) ≈ D_L(z), we tested the DD relation by assuming that η is a function of redshift parameterized by two different expressions: η(z) = 1 + η_0 z and η(z) = 1 + η_0 z/(1 + z), where η_0 is a constant parameter quantifying a possible departure from the strict validity of the reciprocity relation (η_0 = 0). In the best scenario (linear parameterization), we obtain η_0 = -0.28^(+0.44)_(-0.44) (2σ, statistical + systematic errors) for the De Filippis et al. sample (elliptical geometry), a result only marginally compatible with the DD relation. However, for the Bonamente et al. sample (spherical geometry) the constraint is η_0 = -0.42^(+0.34)_(-0.34) (3σ, statistical + systematic errors), which is clearly incompatible with the distance-duality relation.
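For concreteness, the sketch below reproduces the logic of the linear-parameterization test with placeholder numbers (not the Constitution or cluster data): η_obs is formed from redshift-matched D_L and D_A values, and η_0 is obtained by χ² minimization, with the 1σ interval read off from Δχ² = 1.

```python
# Illustration only (placeholder distances and errors) of the model-independent test:
# eta_obs = D_L (1+z)^-2 / D_A, fitted with the linear parameterization eta(z) = 1 + eta0*z.
import numpy as np

z = np.array([0.10, 0.20, 0.30, 0.45])          # matched redshifts (Δz < 0.005)
DL = np.array([460., 990., 1550., 2450.])       # SN Ia luminosity distances [Mpc] (toy)
DA = np.array([378., 684., 915., 1160.])        # cluster angular diameter distances [Mpc] (toy)
sigma_eta = np.array([0.08, 0.09, 0.10, 0.12])  # propagated uncertainties (assumed)

eta_obs = DL * (1.0 + z) ** (-2) / DA

def chi2(eta0):
    return np.sum(((eta_obs - (1.0 + eta0 * z)) / sigma_eta) ** 2)

grid = np.linspace(-1.0, 1.0, 2001)
chi2_vals = np.array([chi2(e0) for e0 in grid])
best = grid[np.argmin(chi2_vals)]
ok = grid[chi2_vals <= chi2_vals.min() + 1.0]   # 1-sigma band from Delta chi^2 = 1
print(f"eta0 = {best:+.3f}  (1 sigma: [{ok.min():+.3f}, {ok.max():+.3f}])")
```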
Abstract:
The mechanism of incoherent π⁰ and η photoproduction from complex nuclei is investigated from 4 to 12 GeV with an extended version of the multicollisional Monte Carlo (MCMC) intranuclear cascade model. The calculations take into account the elementary photoproduction amplitudes via a Regge model and the nuclear effects of photon shadowing, Pauli blocking, and meson-nucleus final-state interactions. The results for π⁰ photoproduction reproduced for the first time the magnitude and energy dependence of the measured ratios σ(γA)/σ(γN) for several nuclei (Be, C, Al, Cu, Ag, and Pb) from a Cornell experiment. The results for η photoproduction fitted the inelastic background in Cornell's yields remarkably well, a background which is clearly not isotropic as previously assumed in Cornell's analysis. With this constraint on the background, the η → γγ decay width was extracted using the Primakoff method, combining Be and Cu data [Γ(η→γγ) = 0.476(62) keV] and using Be data only [Γ(η→γγ) = 0.512(90) keV], where the errors are only statistical. These results are in sharp contrast (by ~50-60%) with the value reported by the Cornell group [Γ(η→γγ) = 0.324(46) keV] and in line with the Particle Data Group average of 0.510(26) keV.
Abstract:
The double helicity asymmetry in neutral pion production for p_T = 1 to 12 GeV/c was measured with the PHENIX experiment to access the gluon-spin contribution, ΔG, to the proton spin. The measured asymmetries are consistent with zero, and at a theory scale of μ² = 4 GeV² a next-to-leading-order QCD analysis gives ΔG_[0.02,0.3] = 0.2, with a constraint of -0.7 < ΔG_[0.02,0.3] < 0.5 at Δχ² = 9 (~3σ) for the sampled gluon momentum fraction (x) range, 0.02 to 0.3. The results are obtained using predictions for the measured asymmetries generated from four representative fits to polarized deep inelastic scattering data. We also consider the dependence of the ΔG constraint on the choice of the theoretical scale, a dominant uncertainty in these predictions.
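For context, the double helicity asymmetry referred to above is conventionally defined as below; this is the standard expression used in such measurements and is not transcribed from this Letter.

```latex
% Standard definition of the double helicity asymmetry (for context only).
\[
  A_{LL} \;=\; \frac{\sigma_{++} - \sigma_{+-}}{\sigma_{++} + \sigma_{+-}}
  \;\simeq\; \frac{1}{|P_B P_Y|}\,
  \frac{N_{++} - R\,N_{+-}}{N_{++} + R\,N_{+-}},
  \qquad R = \frac{L_{++}}{L_{+-}},
\]
% where ++ / +- label same / opposite beam-helicity combinations, N are the pi0 yields,
% P_B and P_Y are the beam polarizations, and R is the relative luminosity.
```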