11 results for Telemetry of process variables
at Universidade do Minho
Abstract:
When representing the requirements for an intended software solution during the development process, a logical architecture is a model that provides an organized view of how functionalities behave, regardless of the technologies to be implemented. If the logical architecture represents an ambient assisted living (AAL) ecosystem, such representation is a complex task due to the existence of interrelated multidomains, which most of the time results in incomplete and incoherent user requirements. In this chapter, we present the results obtained when applying process-level modeling techniques to the derivation of the logical architecture for a real industrial AAL project. We adopt a V-Model-based approach that expresses the AAL requirements in a process-level perspective, instead of the traditional product-level view. Additionally, we ensure compliance of the derived logical architecture with the National Institute of Standards and Technology (NIST) reference architecture as nonfunctional requirements to support the implementation of the AAL architecture in cloud contexts.
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation with the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
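To make the kind of energy balance behind such a model concrete, the following is a minimal lumped-capacitance sketch of one filament segment cooling by convection and radiation to the environment and by conduction to the support. All property values, exchange-area fractions and heat-transfer coefficients are hypothetical placeholders; this is not the modelling software or the parameter set discussed in the work above.

```python
# Minimal lumped-capacitance cooling sketch for one deposited filament segment.
# All property values and exchange areas are illustrative placeholders only.
import math

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

# Hypothetical filament segment (ABS-like properties)
diameter = 0.3e-3         # m
length = 1.0e-3           # m
rho, cp = 1050.0, 2000.0  # kg/m^3, J/(kg K)
volume = math.pi * (diameter / 2) ** 2 * length
area = math.pi * diameter * length               # lateral surface

# Hypothetical exchange conditions
h_conv, T_env = 60.0, 323.0       # W/(m^2 K), K (convection with the environment)
emissivity = 0.9                  # radiation to the environment
h_cond, T_support = 200.0, 343.0  # W/(m^2 K), K (contact with support/adjacent filament)
frac_conv, frac_rad, frac_cond = 0.6, 0.6, 0.4   # surface fractions per mechanism

def cool(T0=503.0, dt=1e-3, t_end=2.0):
    """Explicitly integrate dT/dt = -(q_conv + q_rad + q_cond) / (rho*cp*V)."""
    T, t = T0, 0.0
    while t < t_end:
        q_conv = h_conv * frac_conv * area * (T - T_env)
        q_rad = emissivity * SIGMA * frac_rad * area * (T**4 - T_env**4)
        q_cond = h_cond * frac_cond * area * (T - T_support)
        T -= (q_conv + q_rad + q_cond) * dt / (rho * cp * volume)
        t += dt
    return T

print(f"Filament temperature after 2 s: {cool():.1f} K")
```

Comparing the magnitudes of q_conv, q_rad and q_cond in such a balance is one way to decide which phenomena must be retained in a full numerical model.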
Abstract:
Glazing is a technique used to retard fish deterioration during storage. This work focuses on the study of distinct variables (fish temperature, coating temperature, dipping time) that affect the thickness of edible coatings (water glazing and 1.5% chitosan) applied to frozen fish. Samples of frozen Atlantic salmon (Salmo salar) at -15, -20 and -25 °C were either glazed with water at 0.5, 1.5 or 2.5 °C or coated with a 1.5% chitosan solution at 2.5, 5 or 8 °C, by dipping for 10 to 60 s. For both water and chitosan coatings, lowering the salmon and coating solution temperatures resulted in an increase of coating thickness. Under the same conditions, higher thickness values were obtained when using chitosan (maximum thickness of 1.41±0.05 mm) compared to water (maximum thickness of 0.84±0.03 mm). Freezing temperature and crystallization heat were found to be lower for the 1.5% chitosan solution than for water, thus favoring phase change. Salmon temperature profiles made it possible to determine, for different dipping conditions, whether the salmon temperature remained within food safety standards to prevent the growth of pathogenic microorganisms. The concept of safe dipping time is proposed to define how long a frozen product can be dipped into a solution without its temperature rising to a point where it can constitute a hazard.
Abstract:
First published online: December 16, 2014.
Abstract:
During the last few years, many research efforts have been made to improve the design of ETL (Extract-Transform-Load) systems. ETL systems are considered very time-consuming, error-prone and complex, involving several participants from different knowledge domains. ETL processes are among the most important components of a data warehousing system and are strongly influenced by the complexity of business requirements and by their change and evolution. These aspects influence not only the structure of a data warehouse but also the structures of the data sources involved. To minimize the negative impact of such variables, we propose the use of ETL patterns to build specific ETL packages. In this paper, we formalize this approach using BPMN (Business Process Model and Notation) for modelling more conceptual ETL workflows, mapping them to real execution primitives through the use of a domain-specific language that allows for the generation of specific instances that can be executed in an ETL commercial tool.
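As an illustration only (not the authors' domain-specific language), the sketch below shows the pattern-based idea: a conceptual pattern name, as it might label a BPMN task, expands into an ordered list of execution primitives that an ETL engine could run. The pattern names, the primitives and the generate_package helper are hypothetical.

```python
# Hypothetical sketch of mapping conceptual ETL patterns to execution primitives.
# Pattern names and primitives are illustrative; they are not the authors' DSL
# nor the command set of any particular commercial ETL tool.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Primitive:
    name: str
    params: Dict[str, object]

# A tiny "pattern catalogue": each conceptual pattern expands into primitives.
PATTERN_CATALOGUE: Dict[str, List[Primitive]] = {
    "surrogate_key_pipeline": [
        Primitive("lookup", {"table": "dim_customer", "key": "customer_nk"}),
        Primitive("generate_key", {"sequence": "sk_customer"}),
        Primitive("insert", {"table": "dim_customer"}),
    ],
    "slowly_changing_dimension": [
        Primitive("detect_changes", {"compare": ["name", "address"]}),
        Primitive("expire_row", {"flag": "current_flag"}),
        Primitive("insert", {"table": "dim_customer"}),
    ],
}

def generate_package(workflow: List[str]) -> List[Primitive]:
    """Expand an ordered list of pattern names (e.g. taken from BPMN tasks)
    into the flat list of primitives an execution engine could run."""
    package: List[Primitive] = []
    for pattern in workflow:
        package.extend(PATTERN_CATALOGUE[pattern])
    return package

if __name__ == "__main__":
    for step in generate_package(["surrogate_key_pipeline", "slowly_changing_dimension"]):
        print(step.name, step.params)
```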
Abstract:
Various differential cross-sections are measured in top-quark pair (tt̄) events produced in proton-proton collisions at a centre-of-mass energy of √s = 7 TeV at the LHC with the ATLAS detector. These differential cross-sections are presented in a data set corresponding to an integrated luminosity of 4.6 fb⁻¹. The differential cross-sections are presented in terms of kinematic variables of a top-quark proxy referred to as the pseudo-top-quark, whose dependence on theoretical models is minimal. The pseudo-top-quark can be defined in terms of either reconstructed detector objects or stable particles in an analogous way. The measurements are performed on tt̄ events in the lepton+jets channel, requiring exactly one charged lepton and at least four jets, with at least two of them tagged as originating from a b-quark. The hadronic and leptonic pseudo-top-quarks are defined via the leptonic or hadronic decay mode of the W boson produced by the top-quark decay in events with a single charged lepton. The cross-section is measured as a function of the transverse momentum and rapidity of both the hadronic and leptonic pseudo-top-quark, as well as the transverse momentum, rapidity and invariant mass of the pseudo-top-quark pair system. The measurements are corrected for detector effects and are presented within a kinematic range that closely matches the detector acceptance. Differential cross-section measurements of the pseudo-top-quark variables are compared with several Monte Carlo models that implement next-to-leading-order or leading-order multi-leg matrix-element calculations.
Abstract:
Excessive accumulation of long-chain fatty acids (LCFA) in methanogenic bioreactors is the cause of process failure associated with a severe decrease in methane production. In particular, the fast and persistent accumulation of palmitate is critical and still not elucidated. Aerobes or facultative anaerobes were detected in those reactors, raising new questions on LCFA biodegradation. To gain insight into the influence of oxygen, two bioreactors were operated under microaerophilic and anaerobic conditions, with oleate at 1 and 4 gCOD/(L·d). Palmitate accumulated up to 2 and 16 gCOD/L in the anaerobic and microaerophilic reactor, respectively, which shows the importance of oxygen in this conversion. A second experiment was designed to understand the dynamics of oleate-to-palmitate conversion. A CSTR and a PFR were assembled in series and fed with oleate under microaerophilic conditions. HRTs from 6 to 24 h were applied in the CSTR, and from 14 to 52 min in the PFR. In the PFR a biofilm was formed in which palmitate accounted for 82% of total LCFA. Pseudomonas was the predominant genus (42%) in this biofilm, highlighting the role of aerobic and facultative anaerobic bacteria in LCFA bioconversion.
Abstract:
Master's dissertation in Industrial Engineering (specialization area: Quality, Safety and Maintenance)
Abstract:
This data article is related to the research article entitled "The role of ascorbate peroxidase, guaiacol peroxidase, and polysaccharides in cassava (Manihot esculenta Crantz) roots under postharvest physiological deterioration" by Uarrota et al. (2015), Food Chemistry 197, Part A, 737–746. The stress due to PPD of cassava roots leads to the formation of ROS, which are extremely harmful and accelerate cassava spoiling. To prevent or alleviate injuries from ROS, plants have evolved antioxidant systems that include non-enzymatic and enzymatic defence systems such as ascorbate peroxidase, guaiacol peroxidase and polysaccharides. In this data article, a dataset called newdata can be found, in RData format, with 60 observations and 6 variables. The first two variables are Samples and Cultivars; the remaining variables contain spectrophotometric data of ascorbate peroxidase, guaiacol peroxidase, tocopherol and total proteins, together with arcsine-transformed cassava PPD scores. For further interpretation and analysis in R software, a report is also provided. Means of all variables and standard deviations are also provided in the supplementary tables (data.long3.RData, data.long4.RData and meansEnzymes.RData); raw data of PPD scoring without transformation (PPDmeans.RData) and days of storage (days.RData) are also provided for data analysis reproducibility in R software.
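For readers working outside R, the dataset described above could also be inspected from Python. The sketch below assumes the third-party pyreadr package, that newdata.RData is in the working directory, and that the R object inside the file is also named newdata; the columns actually printed depend on the real contents of the archive.

```python
# Sketch: loading the newdata.RData dataset from Python instead of R.
# Assumes the third-party 'pyreadr' package (pip install pyreadr); the file
# name follows the data article, but the object/column names are assumptions.
import pyreadr

result = pyreadr.read_r("newdata.RData")   # dict-like: R object name -> pandas DataFrame
newdata = result["newdata"]                # 60 observations, 6 variables per the article

print(newdata.shape)
print(newdata.columns.tolist())            # e.g. Samples, Cultivars, enzyme readings, PPD score
print(newdata.groupby("Cultivars").mean(numeric_only=True))
```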
Abstract:
Master's dissertation in Childhood Education (specialization area: Supervision and Pedagogy of Childhood)
Abstract:
The acquisition of data in real time is fundamental to provide appropriate services and improve health professionals' decision-making. In this paper, a pervasive, adaptive data acquisition architecture for medical devices (e.g. vital signs monitors, ventilators and sensors) is presented. The architecture was deployed in a real context, an Intensive Care Unit, where it is providing clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patients' variables (vital signs, ventilation) across the network. The paper shows the ventilation acquisition process as an example. The clients are installed in a machine near the patient's bed. They are then connected to the ventilators, and the monitored data is sent to a multithreaded server which, using Health Level Seven (HL7) protocols, records the data in the database. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant system that ensures the data is stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and it is able to adapt to any service using streaming data.
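The flow described above (bedside clients streaming ventilation values to a multithreaded server that persists each record, with a local fallback when the store is unavailable) can be sketched roughly as follows. This is not the INTCare gateway itself: the plain key=value message format, the port and the SQLite store are illustrative stand-ins, and a real deployment would use a proper HL7 library.

```python
# Simplified sketch of a multithreaded acquisition server with a local fallback
# buffer. Not the INTCare gateway: the "key=value" line format, the port and the
# SQLite store are illustrative stand-ins for the HL7-based pipeline.
import socket
import sqlite3
import threading
from datetime import datetime, timezone

DB_PATH = "vitals.db"
FALLBACK_PATH = "fallback.log"

def store(record: dict) -> None:
    """Persist one observation; fall back to a local file if the database fails."""
    try:
        with sqlite3.connect(DB_PATH) as db:
            db.execute("CREATE TABLE IF NOT EXISTS obs(ts TEXT, bed TEXT, name TEXT, value TEXT)")
            db.execute("INSERT INTO obs VALUES (?, ?, ?, ?)",
                       (record["ts"], record["bed"], record["name"], record["value"]))
    except sqlite3.Error:
        with open(FALLBACK_PATH, "a") as f:        # fault tolerance: keep data locally
            f.write(repr(record) + "\n")

def handle_client(conn: socket.socket) -> None:
    """Each bedside client gets its own thread; lines look like 'bed=3;name=FiO2;value=0.4'."""
    with conn, conn.makefile("r") as stream:
        for line in stream:
            fields = dict(item.split("=", 1) for item in line.strip().split(";") if "=" in item)
            if {"bed", "name", "value"} <= fields.keys():
                fields["ts"] = datetime.now(timezone.utc).isoformat()
                store(fields)

def serve(host: str = "0.0.0.0", port: int = 9000) -> None:
    """Accept connections and hand each one to a worker thread."""
    with socket.create_server((host, port)) as server:
        while True:
            conn, _addr = server.accept()
            threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()
```

The fallback file stands in for the fault-tolerance behaviour described in the abstract: observations are never dropped, only queued locally until the database is reachable again.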