951 results for Schema Matching
Abstract:
Stereo Vision is a popular research topic in the field of Computer Vision; it consists of using two images of the same scene, produced by two different cameras, to extract 3D information. The basic idea of Stereo Vision is to simulate human binocular vision: the two cameras are arranged horizontally to act as "eyes" looking at the 3D scene. By comparing the two resulting images, information about the positions of the objects in the scene can be recovered. In this report we present a Stereo Vision algorithm: a parallel algorithm whose goal is to trace the contour lines of a geographical area. The algorithm was originally implemented for the Connection Machine CM-2, a supercomputer developed in the 1980s, and was written in *Lisp, a language derived from Lisp and designed for that machine. This report also covers the translation and implementation of the algorithm in CUDA, a parallel computing architecture developed by NVIDIA that allows parallel code to be executed on GPUs. We also examine the difficulties encountered in the translation from *Lisp to CUDA.
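The abstract does not reproduce the algorithm itself; as background, the core stereo-matching step, estimating a per-pixel disparity by comparing small windows of the two images, can be sketched with naive sum-of-absolute-differences (SAD) block matching. This is a minimal hypothetical illustration, not the *Lisp/CUDA contour-line algorithm described above:

```python
def disparity_map(left, right, block=3, max_disp=4):
    """Naive SAD block matching: for each left-image pixel, find the
    horizontal shift d that minimizes the sum of absolute differences
    between a (block x block) window in the left image and the same
    window shifted d pixels to the left in the right image."""
    h, w = len(left), len(left[0])
    half = block // 2
    disp = [[0] * w for _ in range(h)]
    for y in range(half, h - half):
        for x in range(half, w - half):
            best_cost, best_d = float("inf"), 0
            for d in range(min(max_disp, x - half) + 1):
                cost = sum(
                    abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
                    for dy in range(-half, half + 1)
                    for dx in range(-half, half + 1)
                )
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y][x] = best_d
    return disp
```

The per-pixel search is embarrassingly parallel, which is what makes this kind of algorithm a natural fit for a GPU architecture such as CUDA.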
Abstract:
Alloimmunisation is a major complication in patients with sickle cell disease (SCD) receiving red blood cell (RBC) transfusions, and despite the provision of Rh-phenotyped RBC units, Rh antibodies still occur. In many cases these antibodies in patients positive for the corresponding Rh antigen are considered autoantibodies, but variant RH alleles found in SCD patients can also contribute to Rh alloimmunisation. In this study, we characterised variant RH alleles in 31 SCD patients who made antibodies to Rh antigens despite antigen-positive status, and evaluated the clinical significance of the antibodies produced. RHD and RHCE BeadChip™ from BioArray Solutions and/or amplification and sequencing of exons were used to identify the RH variants. The serological features of all Rh antibodies in antigen-positive patients were analysed, and the clinical significance of the antibodies was evaluated by retrospective analysis of haemoglobin (Hb) levels before and after transfusion; the change from baseline pre-transfusion Hb and the percentage of HbS were also determined. We identified variant RH alleles in 31/48 (65%) of SCD patients with Rh antibodies. Molecular analyses revealed the presence of partial RHD alleles and variant RHCE alleles associated with altered C and e antigens. Five patients were compound heterozygotes for RHD and RHCE variants. Retrospective analysis showed that 42% of the antibodies produced by patients with RH variants were involved in delayed haemolytic transfusion reactions or decreased survival of transfused RBCs. In this study, we found that Rh antibodies in SCD patients with RH variants can be clinically significant and, therefore, matching patients on the basis of RH variants should be considered.
Abstract:
This paper argues in favor of concord feature valuing within the DP in terms of the Agree operation (Chomsky, 1999), with no recourse to any other mechanism. I show that Agree accounts for feature valuing both at the sentence level and within the DP, contrary to Chomsky's (1999) suggestion that concord in the DP should involve some other checking mechanism.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
We introduce the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS). CATT-BRAMS is an on-line transport model fully consistent with the simulated atmospheric dynamics. Emission sources for trace gases from biomass burning and urban-industrial-vehicular activities, and for aerosol particles from biomass burning, are obtained from several published datasets and remote sensing information. The tracer and aerosol mass concentration prognostics include, in addition to grid-scale transport, the effects of sub-grid-scale turbulence in the planetary boundary layer, convective transport by shallow and deep moist convection, wet and dry deposition, and the plume rise associated with vegetation fires. The radiation parameterization takes into account the interaction between the simulated biomass burning aerosol particles and short- and long-wave radiation. The atmospheric model BRAMS is based on the Regional Atmospheric Modeling System (RAMS), with several improvements, including the cumulus convection representation, soil moisture initialization and a surface scheme tuned for the tropics. In this paper the CATT-BRAMS model is used to simulate carbon monoxide and particulate matter (PM2.5) surface fluxes and atmospheric transport during the 2002 LBA field campaigns, conducted during the transition from the dry to the wet season in the southwest Amazon Basin. Model evaluation is addressed through comparisons between model results and near-surface, radiosonde and airborne measurements performed during the field campaign, as well as remote-sensing-derived products. We show that the emission strengths match the carbon monoxide observed in the LBA campaign. A relatively good agreement with the MOPITT data is also obtained, in spite of several difficulties implied by the MOPITT a priori assumptions.
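For intuition about the grid-scale tracer transport that underlies prognostics like those above, a toy first-order upwind advection step on a periodic 1-D domain can be sketched as follows. This is a textbook scheme, not the CATT-BRAMS implementation:

```python
def upwind_advect(c, u, dx, dt, steps):
    """First-order upwind step for 1-D tracer advection dc/dt = -u*dc/dx
    with constant wind u > 0 on a periodic domain. Python's negative
    indexing (c[-1]) supplies the periodic boundary for free."""
    cfl = u * dt / dx
    assert 0.0 < cfl <= 1.0, "CFL stability condition violated"
    n = len(c)
    for _ in range(steps):
        c = [c[i] - cfl * (c[i] - c[i - 1]) for i in range(n)]
    return c
```

With a Courant number of exactly 1 the scheme shifts the tracer field by one grid cell per step while conserving total mass, which makes it easy to sanity-check.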
Abstract:
We adopt the Dirac model for graphene and calculate the Casimir interaction energy between a plane suspended graphene sample and a parallel plane perfect conductor. This is done in two ways. First, we use the quantum-field-theory approach and evaluate the leading-order diagram in a theory with 2+1-dimensional fermions interacting with 3+1-dimensional photons. Next, we consider an effective theory for the electromagnetic field with matching conditions induced by quantum quasiparticles in graphene. The first approach turns out to be the leading order in the coupling constant of the second one. The Casimir interaction for this system appears to be rather weak. It exhibits a strong dependence on the mass of the quasiparticles in graphene.
Abstract:
An (n, d)-expander is a graph G = (V, E) such that for every X ⊆ V with |X| ≤ 2n − 2 we have |Γ_G(X)| ≥ (d + 1)|X|. A tree T is small if it has at most n vertices and maximum degree at most d. Friedman and Pippenger (1987) proved that any (n, d)-expander contains every small tree. However, their elegant proof does not seem to yield an efficient algorithm for obtaining the tree. In this paper, we give an alternative result that does admit a polynomial-time algorithm for finding the immersion of any small tree in subgraphs G of (N, D, λ)-graphs Λ, as long as G contains a positive fraction of the edges of Λ and λ/D is small enough. In several applications of the Friedman-Pippenger theorem, including the ones in the original paper of those authors, the (n, d)-expander G is a subgraph of an (N, D, λ)-graph as above. Therefore, our result suffices to provide efficient algorithms for such previously non-constructive applications. As an example, we discuss a recent result of Alon, Krivelevich, and Sudakov (2007) concerning the embedding of nearly spanning bounded-degree trees, the proof of which makes use of the Friedman-Pippenger theorem. We shall also show a construction, inspired by the Wigderson-Zuckerman expander graphs, for which any sufficiently dense subgraph contains all trees of sizes and maximum degrees achieving essentially optimal parameters. Our algorithmic approach is based on a reduction of the tree embedding problem to a certain on-line matching problem for bipartite graphs, solved by Aggarwal et al. (1996).
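The on-line matching procedure of Aggarwal et al. (1996) is not spelled out in the abstract; as a baseline for the bipartite matching problem it builds on, standard offline maximum bipartite matching via augmenting paths (Kuhn's algorithm) can be sketched as:

```python
def max_bipartite_matching(adj, n_right):
    """Maximum bipartite matching by augmenting paths (Kuhn's algorithm).
    adj[u] lists the right-side vertices adjacent to left vertex u."""
    match_right = [-1] * n_right  # right vertex -> matched left vertex

    def try_augment(u, visited):
        # Try to match u, possibly re-matching previously matched vertices
        # along an augmenting path.
        for v in adj[u]:
            if v not in visited:
                visited.add(v)
                if match_right[v] == -1 or try_augment(match_right[v], visited):
                    match_right[v] = u
                    return True
        return False

    size = sum(1 for u in range(len(adj)) if try_augment(u, set()))
    return size, match_right
```

In the on-line variant used in the paper, the left vertices arrive one at a time and matching decisions are constrained, which is exactly the difficulty the reduction has to handle.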
Abstract:
Nunes, JA, Crewther, BT, Ugrinowitsch, C, Tricoli, V, Viveiros, L, de Rose Jr, D, and Aoki, MS. Salivary hormone and immune responses to three resistance exercise schemes in elite female athletes. J Strength Cond Res 25(8): 2322-2327, 2011. This study examined the salivary hormone and immune responses of elite female athletes to 3 different resistance exercise schemes. Fourteen female basketball players each performed an endurance scheme (ES: 4 sets of 12 reps at 60% of 1-repetition-maximum (1RM) load, 1-minute rest periods), a strength-hypertrophy scheme (SHS: 1 set each of 5RM, 4RM, 3RM, 2RM, and 1RM with 3-minute rest periods, followed by 3 sets of 10RM with 2-minute rest periods) and a power scheme (PS: 3 sets of 10 reps at 50% 1RM load, 3-minute rest periods) using the same exercises (bench press, squat, and biceps curl). Saliva samples were collected at 07:30 hours, pre-exercise (Pre) at 09:30 hours, postexercise (Post), and at 17:30 hours. Matching samples were also taken on a nonexercising control day. The samples were analyzed for testosterone, cortisol (C), and immunoglobulin A concentrations. The total volume of load lifted differed among the 3 schemes (SHS > ES > PS, p < 0.05). Postexercise C concentrations increased after all schemes compared to control values (p < 0.05). In the SHS, the postexercise C response was also greater than the pre-exercise value (p < 0.05). The current findings confirm that high-volume resistance exercise schemes can stimulate greater C secretion because of their higher metabolic demand. In terms of practical applications, acute changes in C may be used to evaluate the metabolic demands of different resistance exercise schemes, or as a tool for monitoring training strain.
Abstract:
This study analyzed the inter-individual variability of the temporal structure of the basketball throw. Ten male athletes experienced in basketball throwing were filmed, and a number of kinematic movement parameters were analyzed. A biomechanical model provided the relative timing of the shoulder, elbow and wrist joint movements. Inter-individual variability was analyzed using the sequencing and relative timing of ten phases of the throw. To compare the variability of the movement phases between subjects, a discriminant analysis and an ANOVA were applied. The Tukey test was applied to determine where differences occurred. The significance level was p = 0.05. Inter-individual variability was explained by three concomitant factors: (a) a precision control strategy, (b) a velocity control strategy and (c) intrinsic characteristics of the subjects. Therefore, despite the fact that some actions are common to the basketball throwing pattern, each performer demonstrated particular and individual characteristics.
Abstract:
Support for the interoperability and interchangeability of software components that are part of a fieldbus automation system relies on the definition of open architectures, most of them involving proprietary technologies. Concurrently, standard, open and non-proprietary technologies, such as XML, SOAP, Web Services and the like, have greatly evolved and been diffused in the computing area. This article presents a FOUNDATION fieldbus (TM) device description technology named Open-EDD, based on XML and other related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to the EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION fieldbus (TM) protocol, implementing a compiler and a parser, and finally, integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that the new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems.
Abstract:
Nowadays, there is a trend toward reorganizing industry into geographically dispersed systems that carry out their activities autonomously. These systems must maintain coordinated relationships among themselves in order to assure the expected performance of the overall system. Thus, a manufacturing system based on web services is proposed to assure an effective orchestration of services in order to produce final products. In addition, it considers special functions, such as teleoperation and remote monitoring, and users' online requests, among others. Considering the proposed system as a discrete event system (DES), techniques derived from Petri nets (PN), including the Production Flow Schema (PFS), can be used in a PFS/PN approach for modeling. The system is approached at different levels of abstraction: a conceptual model obtained by applying the PFS technique and a functional model obtained by applying PN. Finally, a particular example of the proposed system is presented.
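As a minimal illustration of the Petri net formalism underlying the PFS/PN approach, the token-game semantics (a transition fires by consuming tokens from its input places and producing tokens in its output places) can be sketched as follows; the place names in the example are hypothetical, not taken from the paper:

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds at least
    the required number of tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from the input places (pre)
    and produce tokens in the output places (post). Returns the new
    marking; the original marking is left untouched."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m
```

For instance, a hypothetical "produce" transition that needs one pending order and one free machine, and yields one product while freeing the machine again, is fired as `fire({"order": 1, "machine_free": 1}, {"order": 1, "machine_free": 1}, {"product": 1, "machine_free": 1})`.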
Abstract:
The application of the functionally graded material (FGM) concept to piezoelectric transducers allows the design of composite transducers without interfaces, owing to the continuous change of property values. Thus, large improvements can be achieved, such as a reduction of stress concentration and an increase of bonding strength and bandwidth. This work proposes to design and model FGM piezoelectric transducers and to compare their performance with non-FGM ones. Analytical and finite element (FE) models of FGM piezoelectric transducers radiating a plane pressure wave into a fluid medium are developed and their results compared. The ANSYS software is used for the FE modeling. The analytical model is based on an FGM-equivalent acoustic transmission-line model, implemented using MATLAB software. Two cases are considered: (i) the transducer emits a pressure wave into water and is composed of a graded piezoceramic disk with backing and matching layers made of homogeneous materials; (ii) the transducer has no backing or matching layer, and no external load is simulated. Time- and frequency-domain pressure responses are obtained through a transient analysis. The material properties are graded along the thickness direction. Linear and exponential gradation functions are implemented to illustrate the influence of the gradation on the transducer pressure response, electrical impedance, and resonance frequencies.
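As background on the role of the matching layer mentioned in case (i), the classic quarter-wave matching-layer rule from textbook transmission-line theory (not the FGM model of the paper) chooses the layer impedance as the geometric mean of the two media it joins; the impedance values in the example below are illustrative:

```python
import math

def quarter_wave_impedance(z_source, z_load):
    """Quarter-wave rule: the matching-layer impedance that cancels the
    reflection at the design frequency is the geometric mean of the
    impedances of the media it joins (values e.g. in MRayl)."""
    return math.sqrt(z_source * z_load)

def reflection_coefficient(z1, z2):
    """Pressure reflection coefficient at a plane interface between
    media of acoustic impedances z1 and z2."""
    return (z2 - z1) / (z2 + z1)
```

With roughly 34 MRayl for a piezoceramic and 1.5 MRayl for water (illustrative figures), the rule gives a layer of about 7.1 MRayl, and the water-side reflection is much weaker than at a bare ceramic-water interface; a graded transducer smooths this impedance step continuously instead of in one layer.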
Abstract:
Solid-liquid phase equilibrium modeling of triacylglycerol mixtures is essential for lipid design. Treating the alpha polymorph and the liquid phase as ideal, the Margules 2-suffix excess Gibbs energy model with predictive binary parameter correlations describes the non-ideal beta and beta' solid polymorphs. Solving by direct optimization of the Gibbs free energy enables one to predict, from a bulk mixture composition, the phase compositions at a given temperature and thus the SFC curve, the melting profile and the Differential Scanning Calorimetry (DSC) curve, which are related to end-user lipid properties. Phase diagram, SFC and DSC curve experimental data are qualitatively and quantitatively well predicted for the binary mixture of 1,3-dipalmitoyl-2-oleoyl-sn-glycerol (POP) and 1,2,3-tripalmitoyl-sn-glycerol (PPP); the ternary mixture of 1,3-dimyristoyl-2-palmitoyl-sn-glycerol (MPM), 1,2-distearoyl-3-oleoyl-sn-glycerol (SSO) and 1,2,3-trioleoyl-sn-glycerol (OOO); palm oil; and cocoa butter. Then, the addition to palm oil of Medium-Long-Medium type structured lipids is evaluated, using caprylic acid as the medium chain and long-chain fatty acids (EPA, eicosapentaenoic acid; DHA, docosahexaenoic acid; gamma-linolenic acid, octadecatrienoic acid; and AA, arachidonic acid) as sn-2 substitutes. EPA, DHA and AA increase the melting range on both the fusion and crystallization sides, whereas gamma-linolenic acid shifts the melting range upwards. This predictive tool is useful for the pre-screening of lipids matching desired properties set a priori.
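As background for the ideal-liquid assumption above, the ideal-solution liquidus (Schroeder-van Laar equation) gives the mole fraction of a component that remains dissolved at a given temperature; the non-ideal solid polymorphs are what the Margules model then corrects for. The enthalpy and temperature values in the test are illustrative, not taken from the paper:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def ideal_liquidus_x(T, T_m, dH_fus):
    """Schroeder-van Laar (ideal solution) liquidus: mole fraction of a
    component still dissolved in the liquid at temperature T (K), given
    its pure-component melting point T_m (K) and enthalpy of fusion
    dH_fus (J/mol)."""
    return math.exp(-dH_fus / R * (1.0 / T - 1.0 / T_m))
```

A pure component stays fully dissolved at its own melting point (x = 1) and progressively crystallizes below it, which is the baseline shape of the SFC and melting curves discussed above.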