961 results for "Three layer integration"


Relevance:

100.00%

Publisher:

Abstract:

This paper describes advances in ground-based thermodynamic profiling of the lower troposphere through sensor synergy. The well-documented integrated profiling technique (IPT), which uses a microwave profiler, a cloud radar, and a ceilometer to simultaneously retrieve vertical profiles of temperature, humidity, and liquid water content (LWC) of nonprecipitating clouds, is further developed toward an enhanced performance in the boundary layer and lower troposphere. For a more accurate temperature profile, this is accomplished by including an elevation-scanning measurement mode of the microwave profiler. Height-dependent RMS accuracies of temperature (humidity) ranging from 0.3 to 0.9 K (0.5–0.8 g m⁻³) in the boundary layer are derived from retrieval simulations and confirmed experimentally with measurements at distinct heights taken during the 2005 International Lindenberg Campaign for Assessment of Humidity and Cloud Profiling Systems and its Impact on High-Resolution Modeling (LAUNCH) of the German Weather Service. Temperature inversions, especially of the lower boundary layer, are captured in a very satisfactory way by using the elevation-scanning mode. To improve the quality of liquid water content measurements in clouds, the authors incorporate a sophisticated target classification scheme developed within the European cloud observing network CloudNet. It allows the detailed discrimination between different types of backscatterers detected by cloud radar and ceilometer. Finally, to allow IPT application also to drizzling cases, an LWC profiling method is integrated. This technique classifies the detected hydrometeors into three different size classes using certain thresholds determined by radar reflectivity and/or ceilometer extinction profiles. By inclusion into IPT, the retrieved profiles are made consistent with the measurements of the microwave profiler and an LWC a priori profile. Results of IPT application to 13 days of the LAUNCH campaign are analyzed, and the importance of integrated profiling for model evaluation is underlined.
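As an illustration of the threshold-based size classification mentioned above, the following minimal Python sketch splits radar range gates into three hydrometeor size classes. The threshold values and class names are placeholders chosen for illustration, not the values used by IPT or CloudNet.

```python
# Illustrative threshold classification of hydrometeors into three size classes.
# The dBZ thresholds below are assumed placeholder values, not those used by IPT.
Z_CLOUD_MAX = -20.0    # dBZ: assumed upper bound for small cloud droplets
Z_DRIZZLE_MAX = 0.0    # dBZ: assumed upper bound for drizzle-sized drops

def hydrometeor_size_class(radar_reflectivity_dbz: float) -> str:
    """Assign a radar range gate to one of three hydrometeor size classes."""
    if radar_reflectivity_dbz < Z_CLOUD_MAX:
        return "cloud droplets"
    if radar_reflectivity_dbz < Z_DRIZZLE_MAX:
        return "drizzle"
    return "rain drops"

print(hydrometeor_size_class(-27.0))  # -> 'cloud droplets'
```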

Relevance:

100.00%

Publisher:

Abstract:

A process for preparing three-layer piezoelectrets from fluorinated ethylene-propylene (FEP) copolymer films is introduced. Samples are made from commercial FEP films by means of laser cutting, laser bonding, electrode evaporation, and high-field poling. The observed dielectric-resonance spectra demonstrate the piezoelectricity of the FEP sandwiches. Piezoelectric d33 coefficients up to a few hundred pC/N are achieved. Charging at elevated temperatures can increase the thermal stability of the piezoelectrets. Isothermal experiments for approximately 15 min demonstrate that samples charged at 140 °C keep their piezoelectric activity up to at least 120 °C and retain 70% of their initial d33 even at 130 °C. Acoustical measurements show a relatively flat frequency response in the range between 300 Hz and 20 kHz.

Relevance:

100.00%

Publisher:

Abstract:

Concrete offshore platforms, which are subjected to several loading combinations and thus require as general an analysis as possible, can be designed using the concepts adopted for shell elements, but their resistance to shear forces must be verified at particular cross-sections. This work on the design of shell elements uses the three-layer shell theory. The elements are subjected to combined membrane and plate loading, totalling eight components of internal forces: three membrane forces, three moments (two out-of-plane bending moments and one in-plane, or torsion, moment), and two shear forces. The design method adopted, which uses the iterative process proposed by Lourenço & Figueiras (1993) derived from the equilibrium equations developed by Gupta (1986), is compared with results of experimentally tested shell elements found in the literature using the program DIANA.
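The following Python sketch illustrates, under simplified assumptions, how a three-layer (sandwich) idealisation splits the shell forces: the two outer layers resist the membrane forces plus the moments as a force couple, while the core carries the transverse shear. The lever arm and the sign convention are assumptions made here for illustration; this is not the specific design method compared in the work above.

```python
# Minimal sketch of the three-layer (sandwich) split of shell forces.
# n, m: membrane forces and moments per unit length for 'x', 'y', 'xy';
# 'lever' is the assumed distance between the mid-planes of the outer layers.
# Sign convention (positive moment -> tension in the bottom layer) is assumed.

def outer_layer_forces(n: dict, m: dict, lever: float):
    """Return membrane forces per unit length in the top and bottom layers."""
    top = {k: n[k] / 2.0 - m[k] / lever for k in ("x", "y", "xy")}
    bottom = {k: n[k] / 2.0 + m[k] / lever for k in ("x", "y", "xy")}
    return top, bottom

# Pure bending about x with no membrane force: the two layers carry
# equal and opposite in-plane forces (a force couple).
top, bottom = outer_layer_forces({"x": 0.0, "y": 0.0, "xy": 0.0},
                                 {"x": 50.0, "y": 0.0, "xy": 0.0},
                                 lever=0.8)
print(top, bottom)
```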

Relevance:

100.00%

Publisher:

Abstract:

Objective: To assess the ability of a three-layer graft in the closure of large fetal skin defects. Methods: Ovine fetuses underwent a large (4 x 3 cm) full-thickness skin defect over the lumbar region at 105 days' gestation (term = 140 days). A bilaminar artificial skin was placed over a cellulose interface to cover the defect (3-layer graft). The skin was partially reapproximated with a continuous nylon suture. Pregnancy was allowed to continue and the surgical site was submitted to histopathological analysis at different post-operative intervals. Results: Seven fetuses underwent surgery. One maternal/fetal death occurred, and the remaining 6 fetuses were analyzed. Artificial skin adherence to the wound edges was observed in cases that remained in utero for at least 15 days. Neoskin was present beneath the silicone layer of the bilaminar artificial skin. Conclusions: Our study shows that neoskin can develop in the fetus using a 3-layer graft, including epidermal growth beneath the silicone layer of the bilaminar skin graft. These findings suggest that the fetus is able to reepithelialise even large skin defects. Further experience is necessary to assess the quality of this repair.

Relevance:

100.00%

Publisher:

Abstract:

This file contains the complete ontology (OntoProcEDUOC_OKI_Final.owl). When it is loaded for editing, the OKI ontology corresponding to the implementation level (OntoOKI_DEFINITIVA.owl) must be imported.
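As a sketch of the loading step described above, the snippet below uses the owlready2 Python library (an assumed tool choice; the dataset does not prescribe one) to load the implementation-level OKI ontology and attach it as an import of the process ontology. File paths are placeholders.

```python
from owlready2 import get_ontology

# Placeholder local paths; adjust to where the .owl files actually live.
oki = get_ontology("file:///path/to/OntoOKI_DEFINITIVA.owl").load()
proc = get_ontology("file:///path/to/OntoProcEDUOC_OKI_Final.owl").load()

# Make sure the implementation-level OKI ontology is imported before editing.
if oki not in proc.imported_ontologies:
    proc.imported_ontologies.append(oki)

print([o.base_iri for o in proc.imported_ontologies])
```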

Relevance:

100.00%

Publisher:

Abstract:

Work presented within the scope of the Master's in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.

Relevance:

100.00%

Publisher:

Abstract:

Here, we study the stable integration of real-time optimization (RTO) with model predictive control (MPC) in a three-layer structure. The intermediate layer is a quadratic programming problem whose objective is to compute reachable targets for the MPC layer that lie at the minimum distance from the optimum set points produced by the RTO layer. The lower layer is an infinite-horizon MPC with guaranteed stability and with additional constraints that enforce the feasibility and convergence of the target-calculation layer. The case in which there is polytopic uncertainty in the steady-state model considered in the target calculation is also addressed. The dynamic part of the MPC model is also considered unknown, but it is assumed to be represented by one of the models of a discrete set of models. The efficiency of the methods presented here is illustrated with the simulation of a low-order system. (C) 2010 Elsevier Ltd. All rights reserved.
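A minimal sketch of the intermediate target-calculation layer is given below, assuming a linear steady-state model and using the cvxpy package; the gain matrix, bounds, and RTO set points are made-up numbers. It solves a QP that brings the reachable target as close as possible to the RTO optimum, which is the role the abstract assigns to the intermediate layer (the stability constraints coupling it to the infinite-horizon MPC are omitted).

```python
import numpy as np
import cvxpy as cp

# Assumed linear steady-state model of the plant: y_ss = G @ u_ss + b
G = np.array([[0.8, 0.2],
              [0.1, 0.5]])          # made-up 2x2 steady-state gain matrix
b = np.zeros(2)

y_rto = np.array([1.5, 0.7])        # optimum set points from the RTO layer (made up)
u_min = np.array([-1.0, -1.0])      # input bounds (made up)
u_max = np.array([1.0, 1.0])

u_t = cp.Variable(2)                # reachable input target passed to the MPC layer
y_t = cp.Variable(2)                # corresponding reachable output target

constraints = [y_t == G @ u_t + b, u_t >= u_min, u_t <= u_max]
problem = cp.Problem(cp.Minimize(cp.sum_squares(y_t - y_rto)), constraints)
problem.solve()

print("target inputs:", u_t.value)
print("target outputs:", y_t.value)
```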

Relevance:

100.00%

Publisher:

Abstract:

The Three-Layer distributed mediation architecture, designed by the Secure System Architecture laboratory, employs a layered framework of presence, integration, and homogenization mediators. The architecture has no central component that might affect system reliability. A distributed search technique was adopted in the system to increase its reliability. An Enhanced Chord-like algorithm (E-Chord) was designed and deployed in the integration layer. The E-Chord is a skip-list algorithm based on a Distributed Hash Table (DHT), which is a distributed but structured architecture. A DHT is distributed in the sense that no central unit is required to maintain indexes, and it is structured in the sense that indexes are distributed over the nodes in a systematic manner. Each node maintains three kinds of routing information: a frequency list, a successor/predecessor list, and a finger table. No node in the system maintains all indexes, and each node knows about some other nodes in the system. These nodes, also called composer mediators, are connected in a P2P fashion. A special composer mediator called the global mediator initiates the keyword-based matching decomposition of the request using the E-Chord. It generates an Integrated Data Structure Graph (IDSG) on the fly, creates association and dependency relations between nodes in the IDSG, and then generates a Global IDSG (GIDSG). The GIDSG is a plan that guides the global mediator in integrating the data. It is also used to stream data from the mediators in the homogenization layer, which are connected to the data sources. The connectors start sending data to the global mediator just after the global mediator creates the GIDSG and just before it sends the answer to the presence mediator. Using the E-Chord and the GIDSG made the mediation system more scalable than using a central global schema repository, since all the composers in the integration layer are capable of handling and routing requests. Also, when a composer fails, it only minimally affects the entire mediation system.
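To make the DHT idea concrete, here is a minimal Chord-style sketch in Python: keys and node names are hashed onto an identifier ring, each node's finger table points at node_id + 2^i, and a lookup returns the first node clockwise from the key. This illustrates plain Chord only; the frequency list and the other E-Chord extensions described above are not reproduced, and the parameters are assumptions.

```python
import hashlib

M = 8                      # assumed identifier space of 2**M positions on the ring

def ring_id(key: str) -> int:
    """Hash a key or node name onto the identifier ring (SHA-1 truncated to M bits)."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % (2 ** M)

def finger_starts(node_id: int) -> list:
    """Chord-style finger table entries: node_id + 2**i (mod 2**M) for i = 0..M-1."""
    return [(node_id + 2 ** i) % (2 ** M) for i in range(M)]

def successor(node_ids: list, key_id: int) -> int:
    """Return the first node clockwise from key_id (the node responsible for the key)."""
    ring = sorted(node_ids)
    for n in ring:
        if n >= key_id:
            return n
    return ring[0]          # wrap around the ring

nodes = [ring_id(name) for name in ("composer-A", "composer-B", "composer-C")]
print(successor(nodes, ring_id("some keyword")))
```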

Relevance:

90.00%

Publisher:

Abstract:

This paper analyses the integration process that firms follow to implement Supply Chain Management (SCM). This study was inspired by the integration model proposed by Stevens (1989). He suggests that companies first integrate internally and then extend integration to other supply chain members, such as customers and suppliers. To analyse the integration process, a survey was conducted among Spanish food manufacturers. The results show that there are companies in three different integration stages. In stage I, companies are not integrated. In stage II, companies have a medium-high level of internal integration in the Logistics-Production interface, a low level of internal integration in the Logistics-Marketing interface, and a medium level of external integration. And, in stage III, companies have high levels of integration in both internal interfaces and in some of their supply chain relationships.

Relevance:

90.00%

Publisher:

Abstract:

Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools, and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. We have then defined a number of activities and associated guidelines that prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
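The sketch below illustrates, in Python, the kind of software connector the methodology targets: it maps tool-specific record fields onto terms of a reference ontology and applies per-field transformation rules before handing the data on. All field names, ontology term identifiers, and rules are hypothetical; the actual connectors, reference ontology, and guidelines are those defined by the methodology itself.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict

@dataclass
class Connector:
    """Minimal connector sketch: map tool-specific fields to reference-ontology
    terms and apply transformation rules to the exchanged values."""
    field_map: Dict[str, str]                                   # tool field -> ontology term (hypothetical IRIs)
    transforms: Dict[str, Callable[[Any], Any]] = field(default_factory=dict)

    def translate(self, record: Dict[str, Any]) -> Dict[str, Any]:
        out = {}
        for name, value in record.items():
            term = self.field_map.get(name)
            if term is None:
                continue                                        # drop fields with no semantic mapping
            rule = self.transforms.get(name, lambda v: v)
            out[term] = rule(value)
        return out

# Usage with made-up field names and term identifiers.
connector = Connector(
    field_map={"gene": "refonto:GeneIdentifier", "expr": "refonto:ExpressionLevel"},
    transforms={"expr": float},                                 # normalise the expression value
)
print(connector.translate({"gene": "TP53", "expr": "2.31", "platform": "chipX"}))
```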

Relevance:

90.00%

Publisher:

Abstract:

This dataset presents results from the DFG-funded Arctic-Turbulence-Experiment (ARCTEX-2006) performed by the University of Bayreuth on the island of Svalbard, Norway, during the winter/spring transition of 2006. From May 5 to May 19, 2006, turbulent flux and meteorological measurements were performed on the monitoring field near Ny-Ålesund, at 78°55'24'' N, 11°55'15'' E, Kongsfjord, Svalbard (Spitsbergen), Norway. The ARCTEX-2006 campaign site was located about 200 m southeast of the settlement on flat, snow-covered tundra, 11 m to 14 m above sea level. The permanent sites used for this study consisted of the 10 m meteorological tower of the Alfred Wegener Institute for Polar and Marine Research (AWI), the internationally standardized radiation measurement site of the Baseline Surface Radiation Network (BSRN), the radiosonde launch site, and the AWI tethered balloon launch sites. The temporary sites, set up by the University of Bayreuth, were a 6 m meteorological gradient tower, an eddy-flux measurement complex (EF), and a laser-scintillometer section (SLS). A quality assessment and data correction were applied to detect and eliminate measurement errors typical of a high-arctic landscape. In addition, the quality-checked sensible heat flux measurements are compared with bulk aerodynamic formulas that are widely used in atmosphere-ocean/land-ice models for polar regions, as described in Ebert and Curry (1993, doi:10.1029/93JC00656) and Launiainen and Cheng (1995). These parameterization approaches allow the turbulent surface fluxes to be estimated easily from routine meteorological measurements. The data show:
- the role of the intermittency of the turbulent atmospheric fluctuations of momentum and scalars,
- the existence of a disturbed vertical temperature profile (sharp inversion layer) close to the surface,
- the relevance of possible free-convection events for the snow or ice melt in the Arctic spring at Svalbard, and
- the relevance of meso-scale atmospheric circulation patterns and air-mass advection for the near-surface turbulent heat exchange in the Arctic spring at Svalbard.
Recommendations and improvements regarding the interpretation of eddy-flux and laser-scintillometer data, as well as the arrangement of the instrumentation under polar exchange conditions and (extreme) weather situations, could be derived.
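For reference, a generic bulk aerodynamic estimate of the sensible heat flux of the kind mentioned above is sketched below. The constant transfer coefficient and the other numbers are assumptions made here for illustration; the Ebert and Curry (1993) and Launiainen and Cheng (1995) schemes use stability-dependent coefficients.

```python
# Generic bulk aerodynamic formula for the sensible heat flux:
#   H = rho * c_p * C_H * U * (T_surface - T_air)
# The values below are assumed typical numbers, not those of the cited schemes.
RHO_AIR = 1.3        # kg m^-3, cold near-surface air density (assumed)
CP_AIR = 1005.0      # J kg^-1 K^-1, specific heat of dry air
C_H = 1.3e-3         # dimensionless bulk transfer coefficient (assumed constant)

def sensible_heat_flux(wind_speed: float, t_surface: float, t_air: float) -> float:
    """Sensible heat flux in W m^-2 (positive when the surface is warmer than the air)."""
    return RHO_AIR * CP_AIR * C_H * wind_speed * (t_surface - t_air)

print(sensible_heat_flux(wind_speed=5.0, t_surface=-2.0, t_air=-6.0))  # ~34 W m^-2
```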

Relevance:

80.00%

Publisher:

Abstract:

Book subtitle: International Conference, CENTERIS 2010, Viana do Castelo, Portugal, October 20-22, 2010, Proceedings, Part II

Relevance:

80.00%

Publisher:

Abstract:

Dissertation for obtaining the degree of Doctor in Informatics.

Relevance:

80.00%

Publisher:

Abstract:

We will see the development of the project step by step, from the study of the most important frameworks that can be incorporated into J2EE projects, through careful analysis and design, to the implementation of the basic modules that the centres' management system would include, trying to take advantage of everything offered by the most suitable frameworks and latest-generation technologies such as AJAX, in order to build a flexible and robust system capable of meeting all of the centres' information-management needs. We will also see how several patterns are applied in this three-layer client-server architecture, achieving, among other things, that each component is assigned to a layer at a certain level of abstraction.
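To illustrate the layering idea described above, here is a minimal sketch (written in Python rather than the J2EE/Java stack of the original project) in which each component belongs to exactly one layer: persistence, business logic, or presentation. Class names and data are made up.

```python
# Minimal three-layer sketch: each component is assigned to exactly one layer.

class StudentRepository:               # persistence layer (would be a DAO/ORM in J2EE)
    def __init__(self):
        self._rows = {1: "Anna", 2: "Joan"}
    def find(self, student_id: int) -> str:
        return self._rows[student_id]

class EnrolmentService:                # business layer (application rules live here)
    def __init__(self, repo: StudentRepository):
        self._repo = repo
    def enrolment_label(self, student_id: int) -> str:
        return f"Enrolled: {self._repo.find(student_id)}"

class EnrolmentPage:                   # presentation layer (would be JSP/AJAX views)
    def __init__(self, service: EnrolmentService):
        self._service = service
    def render(self, student_id: int) -> str:
        return f"<p>{self._service.enrolment_label(student_id)}</p>"

print(EnrolmentPage(EnrolmentService(StudentRepository())).render(1))
```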