997 results for Open-edge guarding
Abstract:
Evolution of computational resources and the impact of open source technologies at Embrapa Monitoramento por Satélite
Abstract:
Many problems in early vision are ill-posed; edge detection is a typical example. This paper applies regularization techniques to the problem of edge detection. We derive an optimal filter for edge detection whose size is controlled by the regularization parameter $\lambda$ and compare it to the Gaussian filter. A formula relating the signal-to-noise ratio to the parameter $\lambda$ is derived from regularization analysis for the case of small values of $\lambda$. We also discuss the method of Generalized Cross Validation for obtaining the optimal filter scale. Finally, we use our framework to explain two perceptual phenomena: coarsely quantized images becoming recognizable when they are either blurred or have noise added.
Abstract:
Information representation is a critical issue in machine vision. The representation strategy in the primitive stages of a vision system has enormous implications for the performance in subsequent stages. Existing feature extraction paradigms, like edge detection, provide sparse and unreliable representations of the image information. In this thesis, we propose a novel feature extraction paradigm. The features consist of salient, simple parts of regions bounded by zero-crossings. The features are dense, stable, and robust. The primary advantage of the features is that they have abstract geometric attributes pertaining to their size and shape. To demonstrate the utility of the feature extraction paradigm, we apply it to passive navigation. We argue that the paradigm is applicable to other early vision problems.
Abstract:
This report describes the implementation of a theory of edge detection, proposed by Marr and Hildreth (1979). According to this theory, the image is first processed independently through a set of filters of different sizes, whose shape is the Laplacian of a Gaussian, $\nabla^2 G$. Zero-crossings in the output of these filters mark the positions of intensity changes at different resolutions. Information about these zero-crossings is then used for deriving a full symbolic description of changes in intensity in the image, called the raw primal sketch. The theory is closely tied to early processing in the human visual system. In this report, we first examine the critical properties of the initial filters used in the edge detection process, from both a theoretical and a practical standpoint. The implementation is then used as a test bed for exploring aspects of the human visual system; in particular, acuity and hyperacuity. Finally, we present some preliminary results concerning the relationship between zero-crossings detected at different resolutions, and some observations relevant to the process by which the human visual system integrates descriptions of intensity changes obtained at different resolutions.
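The filter-then-find-zero-crossings pipeline described above is easy to illustrate in one dimension, where the Laplacian of a Gaussian reduces to the Gaussian's second derivative. The sketch below uses illustrative names and is not the report's implementation; it marks the zero-crossing that a step edge produces in the filtered signal.

```python
# 1-D sketch of Marr-Hildreth-style edge detection: filter with the
# second derivative of a Gaussian and mark the zero-crossings.
import numpy as np

def log_kernel_1d(sigma, radius=None):
    """Second derivative of a Gaussian: the 1-D analogue of the LoG filter."""
    if radius is None:
        radius = int(4 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    return (x**2 / sigma**4 - 1 / sigma**2) * g

def zero_crossings(y, eps=1e-6):
    """Indices i where y changes sign between i and i+1 (ignoring tiny jumps)."""
    s = np.sign(y)
    idx = np.where(s[:-1] * s[1:] < 0)[0]
    return idx[np.abs(y[idx] - y[idx + 1]) > eps]

signal = np.where(np.arange(64) < 32, 0.0, 1.0)    # step edge between 31 and 32
response = np.convolve(signal, log_kernel_1d(sigma=2.0), mode="same")
print(zero_crossings(response))                    # a zero-crossing at the step
```

Running the same signal through kernels with several values of `sigma` gives the multi-resolution zero-crossing maps the report combines into the raw primal sketch.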
Abstract:
Background: Anterior open bite occurs when there is a lack of vertical overlap of the upper and lower incisors. The aetiology is multifactorial, including oral habits, unfavourable growth patterns, and enlarged lymphatic tissue with mouth breathing. Several treatments have been proposed to correct this malocclusion, but interventions are not supported by strong scientific evidence.
Objectives: The aim of this systematic review was to evaluate orthodontic and orthopaedic treatments to correct anterior open bite in children.
Search methods: The following databases were searched: the Cochrane Oral Health Group's Trials Register (to 14 February 2014); the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2014, Issue 1); MEDLINE via OVID (1946 to 14 February 2014); EMBASE via OVID (1980 to 14 February 2014); LILACS via BIREME Virtual Health Library (1982 to 14 February 2014); BBO via BIREME Virtual Health Library (1980 to 14 February 2014); and SciELO (1997 to 14 February 2014). We searched for ongoing trials via ClinicalTrials.gov (to 14 February 2014). Chinese journals were handsearched and the bibliographies of retrieved papers were searched.
Selection criteria: All randomised or quasi-randomised controlled trials of orthodontic or orthopaedic treatments, or both, to correct anterior open bite in children.
Data collection and analysis: Two review authors independently assessed the eligibility of all reports identified. Risk ratios (RRs) and corresponding 95% confidence intervals (CIs) were calculated for dichotomous data. Continuous data were expressed as described by the authors.
Main results: Three randomised controlled trials were included, comparing: the effects of Frankel's function regulator-4 (FR-4) with lip-seal training versus no treatment; repelling-magnet splints versus bite-blocks; and a palatal crib associated with high-pull chincup versus no treatment. The study comparing repelling-magnet splints versus bite-blocks could not be analysed because the authors interrupted the treatment earlier than planned, due to side effects in four of ten patients. FR-4 associated with lip-seal training (RR = 0.02, 95% CI 0.00 to 0.38) and a removable palatal crib associated with high-pull chincup (RR = 0.23, 95% CI 0.11 to 0.48) were able to correct anterior open bite. No study described the randomisation process or a sample size calculation, there was no blinding in the cephalometric analysis, and the two studies each evaluated two interventions at the same time. These results should therefore be viewed with caution.
Authors' conclusions: There is weak evidence that the interventions FR-4 with lip-seal training and palatal crib associated with high-pull chincup are able to correct anterior open bite. Given that the included trials have potential bias, these results must be viewed with caution. Recommendations for clinical practice cannot be made based only on the results of these trials. More randomised controlled trials are needed to elucidate the interventions for treating anterior open bite.
Abstract:
Gough, John, (2004) 'Holevo-Ordering and the Continuous-Time Limit for Open Floquet Dynamics', Letters in Mathematical Physics 67(3) pp.207-221 RAE2008
Abstract:
Poolton, Nigel; Towlson, B.M.; Hamilton, B.; Evans, D.A., (2006) 'Synchrotron-laser interactions in hexagonal boron nitride: an examination of charge trapping dynamics at the boron K-edge', New Journal of Physics 8 pp.76 RAE2008
Abstract:
Sermon preached at Boston University School of Theology during Wednesday Chapel on October 24, 2007
Abstract:
The resolution passed by the BU University Council approving an initiative to establish an archive of the research and scholarship produced by the faculty of the University.
Abstract:
The Internet has brought unparalleled opportunities for expanding availability of research by bringing down economic and physical barriers to sharing. The digitally networked environment promises to democratize access, carry knowledge beyond traditional research niches, accelerate discovery, encourage new and interdisciplinary approaches to ever more complex research challenges, and enable new computational research strategies. However, despite these opportunities for increasing access to knowledge, the prices of scholarly journals have risen sharply over the past two decades, often forcing libraries to cancel subscriptions. Today even the wealthiest institutions cannot afford to sustain all of the journals needed by their faculties and students. To take advantage of the opportunities created by the Internet and to further their mission of creating, preserving, and disseminating knowledge, many academic institutions are taking steps to capture the benefits of more open research sharing. Colleges and universities have built digital repositories to preserve and distribute faculty scholarly articles and other research outputs. Many individual authors have taken steps to retain the rights they need, under copyright law, to allow their work to be made freely available on the Internet and in their institution's repository. And, faculties at some institutions have adopted resolutions endorsing more open access to scholarly articles. Most recently, on February 12, 2008, the Faculty of Arts and Sciences (FAS) at Harvard University took a landmark step. The faculty voted to adopt a policy requiring that faculty authors send an electronic copy of their scholarly articles to the university's digital repository and that faculty authors automatically grant copyright permission to the university to archive and to distribute these articles unless a faculty member has waived the policy for a particular article. 
Essentially, the faculty voted to make open access to the results of their published journal articles the default policy for the Faculty of Arts and Sciences of Harvard University. As of March 2008, a proposal is also under consideration in the University of California system by which faculty authors would commit routinely to grant copyright permission to the university to make copies of the faculty's scholarly work openly accessible over the Internet. Inspired by the example set by the Harvard faculty, this White Paper is addressed to the faculty and administrators of academic institutions who support equitable access to scholarly research and knowledge, and who believe that the institution can play an important role as steward of the scholarly literature produced by its faculty. This paper discusses both the motivation and the process for establishing a binding institutional policy that automatically grants a copyright license from each faculty member to permit deposit of his or her peer-reviewed scholarly articles in institutional repositories, from which the works become available for others to read and cite.
Abstract:
A working paper written for Boston University Libraries to foster discussion about how to provide better support for BU faculty authors.
Abstract:
We discuss the design principles of TCP within the context of heterogeneous wired/wireless networks and mobile networking. We identify three shortcomings in TCP's behavior: (i) the protocol's error-detection mechanism, which does not distinguish different types of errors and thus does not suffice for heterogeneous wired/wireless environments; (ii) the error recovery, which is not responsive to the distinctive characteristics of wireless networks, such as transient or burst errors due to handoffs and fading channels; and (iii) the protocol strategy, which does not control the tradeoff between performance measures such as goodput and energy consumption, and often entails a wasteful effort of retransmission and energy expenditure. We discuss a solution framework based on selected research proposals and the associated evaluation criteria for the suggested modifications. We highlight an important angle that has not yet attracted the attention it requires: the need for new performance metrics, appropriate for evaluating the impact of protocol strategies on battery-powered devices.
Abstract:
As new multi-party edge services are deployed on the Internet, application-layer protocols with complex communication models and event dependencies are increasingly being specified and adopted. To ensure that such protocols (and compositions thereof with existing protocols) do not result in undesirable behaviors (e.g., livelocks) there needs to be a methodology for the automated checking of the "safety" of these protocols. In this paper, we present ingredients of such a methodology. Specifically, we show how SPIN, a tool from the formal systems verification community, can be used to quickly identify problematic behaviors of application-layer protocols with non-trivial communication models—such as HTTP with the addition of the "100 Continue" mechanism. As a case study, we examine several versions of the specification for the Continue mechanism; our experiments mechanically uncovered multi-version interoperability problems, including some which motivated revisions of HTTP/1.1 and some which persist even with the current version of the protocol. One such problem resembles a classic degradation-of-service attack, but can arise between well-meaning peers. We also discuss how the methods we employ can be used to make explicit the requirements for hardening a protocol's implementation against potentially malicious peers, and for verifying an implementation's interoperability with the full range of allowable peer behaviors.
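The flavour of such mechanical checking can be conveyed without SPIN itself: enumerate the product state space of the two peers and flag reachable global states where neither side can move. The toy protocol below is an invented, drastically simplified stand-in for the 100 Continue handshake (all state names are illustrative, not from the specification), yet the same exhaustive search surfaces the characteristic stall between a client that waits for 100 Continue and a server that never sends it.

```python
# Toy explicit-state safety check: exhaustively explore the global state
# space of a client and a server and report reachable deadlocks.
# A global state is (cphase, mode, sphase, wire), where "wire" holds the
# single in-flight message (a one-slot channel, a deliberate simplification).
from collections import deque

def successors(state):
    """All transitions enabled in a global state."""
    cphase, mode, sphase, wire = state
    nxt = []
    if cphase == "init" and wire is None:
        nxt.append(("sent", mode, sphase, "HDRS"))      # client sends headers
    if sphase == "idle" and wire == "HDRS":
        nxt.append((cphase, mode, "will100", None))     # server will send 100
        nxt.append((cphase, mode, "wantbody", None))    # server never sends 100
    if sphase == "will100" and wire is None:
        nxt.append((cphase, mode, "wantbody", "100"))   # server emits 100
    if cphase == "sent" and wire == "100":
        nxt.append(("done", mode, sphase, "BODY"))      # 100 seen: send body
    if cphase == "sent" and mode == "eager" and wire is None:
        nxt.append(("done", mode, sphase, "BODY"))      # optimistic client
    if sphase == "wantbody" and wire == "BODY":
        nxt.append((cphase, mode, "done", None))        # request complete
    return nxt

def find_deadlocks(initial_states):
    """BFS over reachable states; collect stuck states where the server
    has not finished -- the safety violation we are looking for."""
    seen = set(initial_states)
    frontier = deque(initial_states)
    deadlocks = []
    while frontier:
        s = frontier.popleft()
        nxt = successors(s)
        if not nxt and s[2] != "done":
            deadlocks.append(s)
        for t in nxt:
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return deadlocks

# A "strict" client waits for 100 Continue; an "eager" one sends the body anyway.
inits = [("init", "strict", "idle", None), ("init", "eager", "idle", None)]
for d in find_deadlocks(inits):
    print("deadlock:", d)
```

The search reports the strict-client/no-100 stall, and also a crossing-message stall in which the eager client's body and the server's 100 contend for the one-slot wire. The latter is partly an artifact of the simplified channel, but it illustrates exactly the kind of corner case that exhaustive exploration, and SPIN's far richer Promela models, are good at surfacing.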