982 results for Inside-Outside Algorithm
Abstract:
PURPOSE: To compare the Full Threshold (FT) and SITA Standard (SS) strategies in glaucomatous patients undergoing automated perimetry for the first time. METHODS: Thirty-one glaucomatous patients who had never undergone perimetry underwent automated perimetry (Humphrey, program 30-2) with both FT and SS on the same day, with an interval of at least 15 minutes between tests. The order of examination was randomized, and only one eye per patient was analyzed. Three analyses were performed: a) all examinations, regardless of the order of application; b) only the first examinations; c) only the second examinations. To calculate the sensitivity of both strategies, the following criteria were used to define abnormality: glaucoma hemifield test (GHT) outside normal limits, pattern standard deviation (PSD) with p<5%, or a cluster of 3 adjacent points with p<5% in the pattern deviation probability plot. RESULTS: When the results of all examinations were analyzed regardless of the order in which they were performed, the number of depressed points with p<0.5% in the pattern deviation probability map was significantly greater with SS (p=0.037), and the sensitivities were 87.1% for SS and 77.4% for FT (p=0.506). When only the first examinations were compared, there were no statistically significant differences in the number of depressed points, but the sensitivity of SS (100%) was significantly greater than that of FT (70.6%) (p=0.048). When only the second examinations were compared, there were no statistically significant differences in the number of depressed points, and the sensitivities of SS (76.5%) and FT (85.7%) did not differ significantly (p=0.664). CONCLUSION: SS may have a higher sensitivity than FT in glaucomatous patients undergoing automated perimetry for the first time. However, this difference tends to disappear in subsequent examinations.
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces, where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called the global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called the local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about the component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside the component. We desire to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small; this is because such an interface hides as many details of the component as possible. We present an algorithm for generating such an interface.
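The quantity that such a component interface must summarize is the component's processing demand over time. A minimal sketch of the standard demand bound function for constrained-deadline sporadic tasks is shown below; the task parameters are illustrative, not taken from the paper, and the paper's actual interface-generation algorithm is not reproduced here.

```python
# Demand bound function (dbf) for a constrained-deadline sporadic task set.
# Each task is a tuple (C, D, T): worst-case execution time, relative
# deadline, and minimum inter-arrival time (period), with D <= T.

def dbf(tasks, t):
    """Total execution demand of jobs with both release and deadline in [0, t]."""
    total = 0
    for C, D, T in tasks:
        if t >= D:
            total += ((t - D) // T + 1) * C
    return total

tasks = [(1, 4, 8), (2, 6, 12)]  # hypothetical tasks (C, D, T)
print([dbf(tasks, t) for t in (0, 4, 6, 12, 16)])
```

An interface that approximates this staircase curve with a few stored points trades a small loss in schedulability analysis for a small number of bits, which is the trade-off the abstract describes.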
Abstract:
Resource-system selection plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/V Es). However, as this paper points out, resource-system selection remains a difficult problem to solve in a D/A/VE. Broadly, the selection problem has been approached from different angles, giving rise to different kinds of models and algorithms for solving it. To support the development of an intelligent and flexible web prototype tool (broker tool) that integrates all the selection-model activities and tools and can adapt to each D/A/VE project or instance (the major goal of our overall project), this paper presents a formulation of one class of resource-selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, solve it using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) from simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, which together define the domain of applicability of the algorithms for the problem studied. The limitations detected point to the need for other kinds of algorithms (approximate-solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of the algorithms' limitations is very important, so that the most suitable algorithm, one that guarantees good performance, can be developed and selected based on the features of each problem.
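To make the combinatorial growth concrete, here is a hypothetical toy instance of the selection problem: one resource must be picked for each processing task so as to minimize total cost. The cost matrix is invented for illustration; exhaustive enumeration grows as (resources per task) ** (number of tasks), which is exactly the processing-time explosion that motivates branch-and-bound pruning and, beyond its domain of applicability, approximate algorithms.

```python
# Brute-force baseline for a toy resource-selection instance:
# pick exactly one resource per processing task, minimizing total cost.
from itertools import product

# hypothetical cost[i][j]: cost of assigning pre-selected resource j to task i
cost = [[4, 2, 7],
        [3, 6, 1],
        [5, 5, 2]]

# enumerate all 3**3 = 27 assignments (a branch-and-bound solver would
# prune most of these using LP-relaxation bounds)
best = min(product(range(3), repeat=3),
           key=lambda pick: sum(cost[i][j] for i, j in enumerate(pick)))
print(best, sum(cost[i][j] for i, j in enumerate(best)))
```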
Abstract:
The container loading problem (CLP) is a combinatorial optimization problem for the spatial arrangement of cargo inside containers so as to maximize the usage of space. The algorithms for this problem are of limited practical applicability if real-world constraints are not considered, one of the most important of which is deemed to be stability. This paper addresses static stability, as opposed to dynamic stability, looking at the stability of the cargo during container loading. This paper proposes two algorithms. The first is a static stability algorithm based on static mechanical equilibrium conditions that can be used as a stability evaluation function embedded in CLP algorithms (e.g. constructive heuristics, metaheuristics). The second proposed algorithm is a physical packing sequence algorithm that, given a container loading arrangement, generates the actual sequence by which each box is placed inside the container, considering static stability and loading operation efficiency constraints.
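A common building block of such stability evaluation functions is the fraction of a box's base that rests on the container floor or on the boxes directly beneath it. The sketch below implements this simple base-support proxy; note that the paper's first algorithm uses full static mechanical equilibrium conditions, which are stricter than this check, and the geometry here is invented for illustration.

```python
# Simplified static-stability check for axis-aligned boxes (x, y, z, w, d, h):
# fraction of a box's base area supported by the floor or by boxes below.

def overlap(a0, a1, b0, b1):
    """Length of the overlap between intervals [a0, a1] and [b0, b1]."""
    return max(0, min(a1, b1) - max(a0, b0))

def support_fraction(box, placed):
    x, y, z, w, d, h = box
    if z == 0:                       # resting directly on the container floor
        return 1.0
    base = w * d
    supported = 0
    for px, py, pz, pw, pd, ph in placed:
        if pz + ph == z:             # top face touches our bottom face
            supported += overlap(x, x + w, px, px + pw) * \
                         overlap(y, y + d, py, py + pd)
    return supported / base

floor_box = (0, 0, 0, 10, 10, 5)
# a box whose base only half rests on the box below
print(support_fraction((5, 0, 5, 10, 10, 5), [floor_box]))
```

Embedded in a constructive heuristic or metaheuristic, a placement would be rejected whenever this fraction falls below a chosen threshold (1.0 for full base support).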
Abstract:
Underground scenarios are among the most challenging environments for accurate and precise 3D mapping, where hostile conditions such as the absence of Global Positioning System coverage, extreme lighting variations and geometrically smooth surfaces may be expected. So far, the state-of-the-art methods in underground modelling remain restricted to environments in which pronounced geometric features are abundant. This limitation is a consequence of the scan-matching algorithms used to solve the localization and registration problems. This paper contributes to the expansion of modelling capabilities to structures characterized by uniform geometry and smooth surfaces, such as road and train tunnels. To achieve that, we combine state-of-the-art techniques from mobile robotics and propose a method for 6DOF platform positioning in such scenarios, which is later used for environment modelling. A visual monocular Simultaneous Localization and Mapping (MonoSLAM) approach based on the Extended Kalman Filter (EKF), complemented by the introduction of inertial measurements in the prediction step, allows our system to localize itself over long distances using exclusively sensors carried on board a mobile platform. By feeding the Extended Kalman Filter with inertial data we were able to overcome the major problem of MonoSLAM implementations, known as scale-factor ambiguity. Despite extreme lighting variations, reliable visual features were extracted with the SIFT algorithm and inserted directly into the EKF mechanism according to the Inverse Depth Parametrization. Wrong frame-to-frame feature matches were rejected through 1-Point RANSAC (Random Sample Consensus). The developed method was tested on a dataset acquired inside a road tunnel, and the navigation results were compared with a ground truth obtained by post-processing a high-grade Inertial Navigation System and L1/L2 RTK-GPS measurements acquired outside the tunnel.
Results from the localization strategy are presented and analyzed.
Abstract:
The search for alternatives to the current energy paradigm, characterized by the indisputable predominance of fossil fuel sources, is the primary motivation for this research. The energy emitted by the Sun that reaches the Earth every day exceeds, by several orders of magnitude, the energy our present society needs. The chimney effect is one way of harnessing that energy. This effect originates in the temperature difference between the inside and the outside of a chimney, which produces a gradient in the fluid densities between the interior and the exterior of the chimney, thereby inducing an air flow. The temperature difference stems from the exposure of the chimney's outer face to solar radiation. In the system we propose to study, air enters the chimney through small openings at its base and, on contact with the chimney's inner walls, is heated from the ambient temperature, Ta, to the internal temperature, Ti. This temperature increase makes the air inside the chimney "lighter" than the colder air outside, causing it to rise along the interior of the chimney. The resulting flow carries kinetic energy that can, for example, be converted into electrical energy by means of turbines. The energy conversion efficiency is higher the lower the air velocity downstream of the turbine. This technology could be deployed in a decentralized way, as with today's concentrating solar thermal and photovoltaic plants located on the outskirts of large cities, or, alternatively, it could be embedded in the urban fabric itself. The investigation shows that the chimney dimensions, the irradiance and the air temperature are the factors with the greatest impact on the hydraulic power generated.
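The driving mechanism described above can be sketched with the textbook stack-effect relations: the buoyancy of warm air over a chimney of height H yields an ideal updraft velocity, and the kinetic power of that flow follows from the density and cross-section. The numbers below are illustrative assumptions, not measurements from the study.

```python
# Back-of-the-envelope solar chimney model (standard stack-effect relations).
import math

g = 9.81          # gravitational acceleration, m/s^2
H = 10.0          # chimney height, m (hypothetical)
Ta = 293.0        # ambient temperature, K
Ti = 313.0        # internal air temperature, K
rho = 1.2         # air density, kg/m^3
A = 0.5           # chimney cross-section, m^2 (hypothetical)

# ideal updraft velocity driven by the inside/outside temperature difference
v = math.sqrt(2 * g * H * (Ti - Ta) / Ta)
# kinetic (hydraulic) power carried by the flow
P = 0.5 * rho * A * v ** 3
print(round(v, 2), round(P, 1))
```

The cubic dependence of P on v, and of v on H and (Ti - Ta), is consistent with the abstract's finding that chimney dimensions, irradiance and air temperature dominate the generated hydraulic power.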
Abstract:
Brazil, a country of continental proportions, presents three profiles of malaria transmission. The first, and numerically the most important, occurs inside the Amazon. The Amazon accounts for approximately 60% of the nation's territory and approximately 13% of the Brazilian population. This region hosts 99.5% of the nation's malaria cases, which are predominantly caused by Plasmodium vivax (i.e., 82% of cases in 2013). The second involves imported malaria, which corresponds to malaria cases acquired outside the region where the individuals live or where the diagnosis was made. These cases are imported from endemic regions of Brazil (i.e., the Amazon) or from other countries in South and Central America, Africa and Asia. Imported malaria comprised 89% of the cases found outside the area of active transmission in Brazil in 2013. These cases raise an important question with respect to both therapeutic and epidemiological issues, because patients, especially those with falciparum malaria, arriving in a region where health professionals may not have experience with the clinical manifestations and diagnosis of malaria could suffer dramatic consequences from a potential delay in treatment. Additionally, because Anopheles vectors exist in most of the country, even a single case of malaria, if not diagnosed and treated immediately, may result in introduced cases, causing outbreaks and even introducing or reintroducing the disease to a non-endemic, receptive region. Cases introduced outside the Amazon usually occur in areas in which malaria was formerly endemic and are transmitted by competent vectors belonging to the subgenus Nyssorhynchus (i.e., Anopheles darlingi, Anopheles aquasalis and species of the Albitarsis complex). The third type of transmission accounts for only 0.05% of all cases and is caused by autochthonous malaria in the Atlantic Forest, located primarily along the southeastern Atlantic Coast.
These cases are caused by parasites that appear to be (or to be very close to) P. vivax and, to a lesser extent, by Plasmodium malariae, and they are transmitted by the bromeliad mosquito Anopheles (Kerteszia) cruzii. This paper deals mainly with the two profiles of malaria found outside the Amazon: the imported and ensuing introduced cases, and the autochthonous cases. We also provide an update regarding the situation in Brazil and the Brazilian endemic Amazon.
Abstract:
The objective of this paper was to describe the radiation and energy balance during the lettuce (Lactuca sativa, L. cv. Verônica) crop cycle inside a polyethylene greenhouse. The radiation and energy balance was measured inside a tunnel greenhouse with a polyethylene cover (100 µm) and in an external area, both areas of 35 m². Global, reflected and net radiation, soil heat flux and air temperature (dry and wet bulb) were measured during the crop cycle. A datalogger operating at 1 Hz and storing 5-minute averages was used. The global (K↓) and reflected (K↑) radiation measurements showed that the average transmission of global radiation (K↓in/K↓ex) was nearly constant at about 79.59%, while the average ratio of reflected radiation (K↑in/K↑ex) was 69.21%, with a standard deviation of 8.47%. The normalized curves of short-wave net radiation relative to global radiation (K*/K↓), found for both environments, were almost constant at the beginning of the cycle; this ratio decreased in the final stage of the crop. The normalized ratio (Rn/K↓) was about 12% greater in the external area when the green crop covered the soil surface. The average long-wave radiation balance (L*) was about 50% greater outside. The energy balance, estimated in terms of vertical fluxes, showed that in the external area, on average, 83.07% of total net radiation was converted into latent heat of evaporation (LE), 18% into soil heat flux (G) and 9.96% into sensible heat (H), while inside the greenhouse 58.71% of total net radiation was converted into LE, 42.68% into H and 28.79% into G.
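The flux partition reported in this abstract is an instance of the standard surface energy balance, in which the net radiation is distributed among the three vertical fluxes (each percentage above is the ratio of one flux to the net radiation):

```latex
R_n = LE + H + G
```

where \(R_n\) is net radiation, \(LE\) the latent heat flux, \(H\) the sensible heat flux and \(G\) the soil heat flux.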
Abstract:
This work studies the possibilities of controlling moisture inside film-sealed packages. The task is motivated by the challenges fresh-food producers face in achieving a longer product shelf-life while meeting increasingly demanding consumer expectations of quality. One way to achieve this is a proper evaluation of flexible plastic films through permeation measurements of the amount of moisture penetrating the film when microperforation is applied. A packaging material requires a proper interaction with moisture transmission between the product and the outside environment. The plastic film standing between fresh fruits and vegetables and the outside environment can provide appropriate respiration rates through micro holes. This work simulates such a process with a water vapour transmission rate (WVTR) experiment using anhydrous CaCl2 as the desiccant, studying the WVTR values of various perforated film materials under different storage conditions (standard, fridge and tropical). The results show the absorption rates of water vapour under the various conditions in grams of H2O/m²/24h.
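In the gravimetric desiccant method the abstract describes, the dish containing anhydrous CaCl2 gains mass as vapour permeates the film, and the WVTR is the mass gain normalized by film area and time. A minimal sketch, with illustrative numbers rather than the paper's measurements:

```python
# Gravimetric WVTR calculation (CaCl2 desiccant method).

def wvtr(mass_gain_g, area_m2, hours):
    """Water vapour transmission rate in g/m^2/24h."""
    return mass_gain_g / area_m2 * 24.0 / hours

# hypothetical reading: 0.36 g gained over 48 h through a 50 cm^2 film sample
print(round(wvtr(0.36, 0.005, 48), 1))
```

Comparing this value across storage conditions (standard, fridge, tropical) and perforation patterns is what the experiment in the abstract does.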
Abstract:
Understanding the relationship between genetic diseases and the genes associated with them is an important problem regarding human health. The vast amount of data created from a large number of high-throughput experiments performed in the last few years has resulted in an unprecedented growth in computational methods to tackle the disease gene association problem. Nowadays, it is clear that a genetic disease is not a consequence of a defect in a single gene. Instead, the disease phenotype is a reflection of various genetic components interacting in a complex network. In fact, genetic diseases, like any other phenotype, occur as a result of various genes working in sync with each other in a single or several biological module(s). Using a genetic algorithm, our method tries to evolve communities containing the set of potential disease genes likely to be involved in a given genetic disease. Having a set of known disease genes, we first obtain a protein-protein interaction (PPI) network containing all the known disease genes. All the other genes inside the procured PPI network are then considered as candidate disease genes as they lie in the vicinity of the known disease genes in the network. Our method attempts to find communities of potential disease genes strongly working with one another and with the set of known disease genes. As a proof of concept, we tested our approach on 16 breast cancer genes and 15 Parkinson's Disease genes. We obtained comparable or better results than CIPHER, ENDEAVOUR and GPEC, three of the most reliable and frequently used disease-gene ranking frameworks.
Abstract:
This paper presents a parallel Linear Hashtable Motion Estimation Algorithm (LHMEA). Most parallel video compression algorithms focus on the Group of Pictures (GOP); based on the LHMEA we proposed earlier [1][2], we developed a parallel motion estimation algorithm that exploits parallelism inside a frame. We divide each reference frame into equally sized regions, which are processed in parallel to increase the encoding speed significantly. The theoretical and measured speed-ups of parallel LHMEA as a function of the number of PCs in the cluster are compared and discussed. Motion vectors (MV) generated from the first-pass LHMEA are used as predictors for the second-pass Hexagonal Search (HEXBS) motion estimation, which searches only a small number of macroblocks (MBs). We evaluated a distributed parallel implementation of the LHMEA of TPA for real-time video compression.
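The intra-frame parallelism described above can be sketched as follows: the frame is split into equally sized row bands and each band's block-matching cost (here, a plain sum of absolute differences) is computed by a separate worker. This is a stand-in illustration using threads on synthetic pixel data, not the paper's hashtable-based predictor or its cluster-of-PCs implementation.

```python
# Split a frame into bands and compute each band's SAD in parallel.
from concurrent.futures import ThreadPoolExecutor
import random

def sad(ref_rows, cur_rows):
    """Sum of absolute differences between two bands of pixel rows."""
    return sum(abs(r - c) for rr, cr in zip(ref_rows, cur_rows)
               for r, c in zip(rr, cr))

random.seed(0)
ref = [[random.randrange(256) for _ in range(64)] for _ in range(64)]
cur = [[random.randrange(256) for _ in range(64)] for _ in range(64)]

# four equally sized bands of 16 rows each, processed concurrently
bands = [(ref[i:i + 16], cur[i:i + 16]) for i in range(0, 64, 16)]
with ThreadPoolExecutor() as pool:
    sads = list(pool.map(lambda b: sad(*b), bands))

print(sum(sads) == sad(ref, cur))  # parallel result matches the serial one
```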
Abstract:
This essay explores how The Truman Show, Peter Weir’s film about a television show, deserves more sustained analysis than it has received since its release in 1998. I will argue that The Truman Show problematizes the binary oppositions of cinema/television, disruption/stability, reality/simulation and outside/inside that structure it. The Truman Show proposes that binary oppositions such as outside/inside exist in a mutually implicating relationship. This deconstructionist strategy not only questions the film’s critical position, but also enables a reflection on the very status of film analysis itself.
Abstract:
This paper analyzes and studies a pervasive computing system for tracking people in a mining environment based on RFID (radio frequency identification) technology. We first explain the RFID fundamentals and the LANDMARC (location identification based on dynamic active RFID calibration) algorithm; we then present the proposed algorithm, which combines LANDMARC with a trilateration technique to obtain the coordinates of the people inside the mine; next, we generalize a pervasive computing system that can be implemented in mining; and finally we present the results and conclusions.
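The trilateration step can be sketched as follows: given a tag's estimated distances to three fixed readers, subtracting the circle equations pairwise yields a 2x2 linear system for the tag's 2-D position. The reader coordinates and distances below are illustrative, and LANDMARC's reference-tag calibration (which produces the distance estimates) is not reproduced here.

```python
# 2-D trilateration from distances to three fixed readers.

def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations yields a linear system A u = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # non-zero if readers are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# hypothetical readers at (0,0), (4,0), (0,4); tag actually at (1,1)
print(trilaterate((0, 0), (4, 0), (0, 4), 2**0.5, 10**0.5, 10**0.5))
```

In practice the RFID distance estimates are noisy, so a least-squares fit over more than three readers, or LANDMARC's nearest-reference-tag weighting, is used instead of this exact solve.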
Abstract:
Figs and fig wasps form a peculiar closed community in which the Ficus tree provides a compact syconium (inflorescence) habitat for the lives of a complex assemblage of Chalcidoid insects. These diverse fig wasp species have intimate ecological relationships within the closed world of the fig syconia. Previous surveys of Wolbachia, maternally inherited endosymbiotic bacteria that infect vast numbers of arthropod hosts, showed that fig wasps have some of the highest known incidences of Wolbachia amongst all insects. We ask whether the evolutionary patterns of Wolbachia sequences in this closed syconium community are different from those in the outside world. In the present study, we sampled all 17 fig wasp species living on Ficus benjamina, covering 4 families, 6 subfamilies and 8 genera of wasps. We made a thorough survey of Wolbachia infection patterns and studied evolutionary patterns in wsp (Wolbachia Surface Protein) sequences. We find evidence for high infection incidences, frequent recombination between Wolbachia strains and considerable horizontal transfer, suggesting rapid evolution of Wolbachia sequences within the syconium community. Though the fig wasps have relatively limited contact with the outside world, Wolbachia may be introduced to the syconium community via horizontal transmission by fig wasp species that have winged males and visit the syconia earlier.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)