11 results for continuous biometric authentication system
at Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Background: This pilot study aimed to verify whether glycemic control can be achieved in type 2 diabetes patients after acute myocardial infarction (AMI) using insulin glargine (iGlar) associated with regular insulin (iReg), compared with the standard intensive care unit protocol, which uses continuous intravenous insulin delivery followed by NPH insulin and iReg (St. Care). Patients and Methods: Patients (n = 20) within 24 h of AMI were randomized to iGlar or St. Care. Therapy was guided exclusively by capillary blood glucose (CBG), but glucometric parameters were also analyzed by a blinded continuous glucose monitoring system (CGMS). Results: Mean glycemia was 141 ± 39 mg/dL for St. Care and 132 ± 42 mg/dL for iGlar by CBG, or 138 ± 35 mg/dL for St. Care and 129 ± 34 mg/dL for iGlar by CGMS. Percentage of time in range (80–180 mg/dL) by CGMS was 73 ± 18% for iGlar and 77 ± 11% for St. Care. No severe hypoglycemia (≤40 mg/dL) was detected by CBG, but CGMS indicated 11 (St. Care) and seven (iGlar) excursions in four subjects from each group, mostly in sulfonylurea users (six of eight patients). Conclusions: This pilot study suggests that equivalent glycemic control without an increase in severe hypoglycemia may be achieved using iGlar with background iReg. Data outputs were controlled by both CBG and CGMS measurements in a real-life setting to ensure reliability. Based on CGMS measurements, there were significant numbers of glycemic excursions outside the target range; however, these were not detected by CBG. In addition, the data indicate that previous use of sulfonylurea may be a major risk factor for severe hypoglycemia irrespective of the type of insulin treatment.
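The time-in-range metric used above can be sketched in a few lines. The 80–180 mg/dL window comes from the abstract; the helper function and the example readings are our own illustration, not study data, and we assume evenly spaced CGM samples.

```python
# Sketch: percentage of time in range (TIR) from evenly spaced CGM
# readings, using the 80-180 mg/dL target window from the abstract.
# The trace below is hypothetical, not data from the study.

def time_in_range(readings_mg_dl, low=80, high=180):
    """Fraction of readings inside [low, high], as a percentage.

    Assumes readings are evenly spaced in time, so the fraction of
    readings in range equals the fraction of time in range.
    """
    if not readings_mg_dl:
        raise ValueError("no readings")
    in_range = sum(1 for g in readings_mg_dl if low <= g <= high)
    return 100.0 * in_range / len(readings_mg_dl)

# Hypothetical trace: 8 readings, 6 of them inside 80-180 mg/dL.
trace = [95, 110, 150, 185, 170, 75, 130, 160]
print(round(time_in_range(trace), 1))  # -> 75.0
```

With denser sampling, CGMS naturally catches excursions that sparse CBG fingersticks miss, which is consistent with the abstract's finding.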
Abstract:
The classic conservative approach to thermal process design can lead to over-processing, especially in laminar flow, where significant distributions of temperature and residence time occur. To optimize quality retention, a more comprehensive model is required. A model comprising differential equations for mass and heat transfer is proposed for simulating the continuous thermal processing of a non-Newtonian food in a tubular system. The model takes into account the contributions of the heating and cooling sections, heat exchange with the ambient air, and effective diffusion associated with non-ideal laminar flow. The case study of soursop juice processing was used to test the model. Various simulations were performed to evaluate the effect of the model assumptions. A marked difference in predicted lethality was observed between the classic approach and the proposed model. The main advantage of the model is its flexibility to represent different aspects with a small computational time, making it suitable for process evaluation and design. (C) 2012 Elsevier Ltd. All rights reserved.
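The lethality the abstract compares is conventionally computed as the F-value, F = ∫ 10^((T − Tref)/z) dt, integrated over the time-temperature history. The sketch below shows this standard calculation; the reference temperature, z-value, and the temperature profile are illustrative assumptions, not values from the paper.

```python
# Sketch: process lethality (F-value) by trapezoidal integration of
# the lethal rate 10**((T - Tref)/z) over a time-temperature history.
# Tref = 121.1 C and z = 10 C are the classic sterilization reference
# values, used here purely for illustration.

def lethality(times_s, temps_c, t_ref=121.1, z=10.0):
    """Equivalent seconds at t_ref, integrated by the trapezoid rule."""
    f = 0.0
    for i in range(1, len(times_s)):
        r0 = 10 ** ((temps_c[i - 1] - t_ref) / z)
        r1 = 10 ** ((temps_c[i] - t_ref) / z)
        f += 0.5 * (r0 + r1) * (times_s[i] - times_s[i - 1])
    return f

# Hypothetical heating-hold-cooling profile: times in s, temps in C.
t = [0, 30, 60, 90, 120]
T = [90, 110, 121.1, 121.1, 95]
print(round(lethality(t, T), 1))
```

The over-processing the abstract mentions appears when the classic approach assigns the whole flow the residence time of the slowest streamline instead of integrating over the actual residence-time distribution.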
Abstract:
This paper provides a paleoenvironmental reconstruction of a Late Quaternary lagoon system in the Jaguaruna region of Santa Catarina state, southern Brazil. Integrated results of bulk sedimentary organic matter characterization (δ¹³C, δ¹⁵N and C/N), microfossil (pollen and diatom) and grain-size analyses from three shallow cores (~2.5 m depth) allowed us to propose an evolving paleogeographic scenario for this coastal region over the last ca. 5500 cal a BP. The lagoonal system in this area was more extensive during the mid-Holocene than today, with a gradual and continuous lagoon-sea disconnection until the present. We add to the debate regarding relative sea-level (RSL) variations for the Brazilian coast during the Holocene and discuss the importance of sedimentary dynamics for interpreting changes in coastal ecosystems. The multi-proxy analysis suggests that changes in coastal ecosystems could be directly related to local sedimentary processes, which are not necessarily linked to RSL fluctuations and/or to climatic variations. Copyright (c) 2011 John Wiley & Sons, Ltd.
Abstract:
Aims: Arthrospira platensis has been studied for single-cell protein production because of its biomass composition and its ability to grow in alternative media. This work evaluated the effects of different dilution rates (D) and urea concentrations (N0) on A. platensis continuous culture in terms of growth, kinetic parameters, biomass composition and nitrogen removal. Methods and Results: Arthrospira platensis was continuously cultivated in a glass-made vertical column photobioreactor agitated with Rushton turbines. Different dilution rates (0.04–0.44 day-1) and urea concentrations (0.5 and 5 mmol l-1) were used. With N0 = 5 mmol l-1, the maximum steady-state biomass concentration was 1415 mg l-1, achieved with D = 0.04 day-1, but the highest protein content (71.9%) was obtained by applying D = 0.12 day-1, attaining a protein productivity of 106.41 mg l-1 day-1. Nitrogen removal reached 99% under steady-state conditions. Conclusions: The best results were achieved by applying N0 = 5 mmol l-1; however, urea led to inhibitory conditions at D = 0.16 day-1, inducing system wash-out. The agitation afforded satisfactory mixing and did not harm the trichome structure. Significance and Impact of the Study: These results can strengthen the basis for the continuous removal of nitrogenous wastewater pollutants using cyanobacteria, with an easily assembled photobioreactor.
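The quantities in the abstract are tied together by the standard steady-state chemostat relations: the specific growth rate equals the dilution rate (μ = D) and volumetric productivity is P = D·X. The back-calculation below, which infers the steady-state biomass at D = 0.12 day⁻¹ from the reported protein productivity and protein content, is our own illustration, not a figure from the paper.

```python
# Sketch: steady-state chemostat bookkeeping. At steady state mu = D,
# and protein productivity P_protein = D * X * f_protein, so the
# implied biomass is X = P_protein / (f_protein * D). The inputs are
# the values reported in the abstract; the back-calculation is ours.

D = 0.12            # dilution rate, 1/day
p_protein = 106.41  # protein productivity, mg/L/day
f_protein = 0.719   # protein mass fraction of biomass (71.9%)

x_biomass = p_protein / (f_protein * D)  # implied steady-state biomass, mg/L
print(round(x_biomass))
```

The trade-off the abstract reports is typical of chemostats: a low D (0.04 day⁻¹) maximizes standing biomass, while a moderate D (0.12 day⁻¹) maximizes productivity, and D beyond the maximum sustainable growth rate washes the culture out.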
Abstract:
The aim of solving the Optimal Power Flow problem is to determine the optimal state of an electric power transmission system, that is, the voltage magnitude and phase angles and the tap ratios of the transformers that optimize the performance of a given system, while satisfying its physical and operating constraints. The Optimal Power Flow problem is modeled as a large-scale mixed-discrete nonlinear programming problem. This paper proposes a method for handling the discrete variables of the Optimal Power Flow problem. A penalty function is presented. Due to the inclusion of the penalty function into the objective function, a sequence of nonlinear programming problems with only continuous variables is obtained and the solutions of these problems converge to a solution of the mixed problem. The obtained nonlinear programming problems are solved by a Primal-Dual Logarithmic-Barrier Method. Numerical tests using the IEEE 14, 30, 118 and 300-Bus test systems indicate that the method is efficient. (C) 2012 Elsevier B.V. All rights reserved.
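One common shape for the penalty described above is a sinusoidal term that vanishes exactly on the discrete set (e.g., transformer tap positions on a uniform step) and is positive in between; the paper's exact penalty may differ, so the sketch below is a generic illustration with hypothetical bounds and step size.

```python
# Sketch: a sin^2 penalty that drives a continuous variable toward the
# nearest member of a uniformly spaced discrete set, as used to handle
# discrete tap ratios in OPF penalty methods. Added to the objective
# with a growing weight, it yields a sequence of continuous NLPs whose
# solutions approach a discrete-feasible point. Bounds and step size
# here are hypothetical.
import math

def discrete_penalty(x, x_min=0.9, step=0.0125):
    """Zero exactly on {x_min, x_min + step, ...}; positive elsewhere."""
    return math.sin(math.pi * (x - x_min) / step) ** 2

# On a grid point the penalty is (numerically) zero; midway it peaks.
print(discrete_penalty(0.9 + 3 * 0.0125))    # grid point: ~0
print(discrete_penalty(0.9 + 3.5 * 0.0125))  # between points: ~1
```

Because the penalized objective stays smooth, each subproblem can be handed directly to a continuous solver such as the primal-dual logarithmic-barrier method the paper uses.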
Abstract:
Introduction: The aim of the present work was to evaluate the flexural fatigue resistance of Reciproc R25 nickel-titanium files, 25 mm, used in continuous rotation or reciprocating motion in a dynamic assay device. Methods: Thirty-six Reciproc R25 files were divided into 2 groups (n = 18) according to the kinematics applied: continuous rotary (group CR) and reciprocating motion (group RM). The files were tested in a dynamic assay device driven by an electric motor at 300 rpm, which permitted reproduction of a pecking motion. The files ran in a groove of a tempered steel ring, simulating instrumentation of a curved root canal with a 40° curvature angle and a 5 mm curvature radius. File fracture was detected by a sensor in the device, and the time was recorded. The data were analyzed statistically by Student's t test, with a significance level of 95%. Results: The instruments driven by reciprocating movement reached significantly higher numbers of cycles before fracture (mean, 1787.78 cycles) compared with instruments driven in continuous rotation (mean, 816.39 cycles). Conclusions: The results showed that reciprocating motion improves the flexural fatigue resistance of the nickel-titanium instrument Reciproc R25 compared with continuous rotation. (J Endod 2012;38:684-687)
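Since the assay records time to fracture at a fixed 300 rpm, the cycle counts reported above follow from a simple conversion, sketched below under the usual assumption that one rotation corresponds to one flexural cycle (the paper does not state its exact conversion, so treat this as an illustration).

```python
# Sketch: converting time-to-fracture into number of cycles at the
# stated 300 rpm. Assumption (ours): one rotation = one flexural cycle.

def cycles_to_fracture(time_s, rpm=300):
    """Number of cycles completed in time_s seconds at the given rpm."""
    return rpm * time_s / 60.0

# At 300 rpm, one minute of survival equals 300 cycles, so the
# reported group means correspond to a few minutes of test time each.
print(cycles_to_fracture(60.0))
```

By this conversion, the RM group mean of 1787.78 cycles corresponds to roughly six minutes of survival, versus under three minutes for the CR group.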
Abstract:
The assessment of thermal process impact in terms of food safety and quality is of great importance for process evaluation and design. This can be accomplished from the analysis of the residence time and temperature distributions coupled with the kinetics of thermal change, or from the use of a proper time-temperature integrator (TTI) as an indicator of safety and quality. The objective of this work was to develop and test enzymic TTIs with rapid detection for the evaluation of continuous HTST pasteurization processes (70-85 °C, 10-60 s) of low-viscosity liquid foods, such as milk and juices. The enzymes peroxidase, lactoperoxidase and alkaline phosphatase in phosphate buffer were tested, and activity was determined with commercial reflectometric strips. Discontinuous thermal treatments at various time-temperature combinations were performed in order to fit a first-order kinetic model of a two-component system. The measured time-temperature history was considered instead of assuming isothermal conditions. Experiments with slow heating and cooling were used to validate the fitted model. Only the alkaline phosphatase TTI showed potential for the evaluation of pasteurization processes. The choice was based on the obtained z-values of the thermostable and thermolabile fractions, on the cost and on the validation tests. (C) 2012 Elsevier Ltd. All rights reserved.
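The two-component first-order model mentioned above can be sketched as follows: each fraction decays as dA/dt = −k(T)·A with a z-value temperature dependence k(T) = k_ref·10^((T − Tref)/z), and the residual activity is the weighted sum of the two fractions integrated over the measured (non-isothermal) history. All parameter values below are illustrative assumptions, not the fitted values from the paper.

```python
# Sketch: residual activity of a two-component (thermolabile +
# thermostable) first-order TTI under a measured time-temperature
# history, with z-value kinetics k(T) = k_ref * 10**((T - t_ref)/z).
# All rate constants, z-values and the profile are hypothetical.
from math import exp

def residual_activity(times_s, temps_c, frac_labile=0.8,
                      k_ref_labile=0.5, z_labile=6.0,
                      k_ref_stable=0.02, z_stable=12.0, t_ref=75.0):
    """A/A0 after the profile, integrating k(T) by the trapezoid rule."""
    def integral_k(k_ref, z):
        s = 0.0
        for i in range(1, len(times_s)):
            k0 = k_ref * 10 ** ((temps_c[i - 1] - t_ref) / z)
            k1 = k_ref * 10 ** ((temps_c[i] - t_ref) / z)
            s += 0.5 * (k0 + k1) * (times_s[i] - times_s[i - 1])
        return s
    return (frac_labile * exp(-integral_k(k_ref_labile, z_labile))
            + (1 - frac_labile) * exp(-integral_k(k_ref_stable, z_stable)))

# Hypothetical HTST-like profile: heat, hold at 75 C for 15 s, cool.
t = [0, 5, 20, 25]
T = [60, 75, 75, 60]
print(round(residual_activity(t, T), 3))
```

Using the measured history rather than assuming isothermal holding matters exactly because the heating and cooling ramps contribute to the integral of k(T), as the validation experiments with slow heating and cooling confirm.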
Abstract:
This work studies the optimization and control of a styrene polymerization reactor. The proposed strategy addresses the case where, because of market conditions and equipment deterioration, the optimal operating point of the continuous reactor shifts significantly over the operation time, and the control system has to search for this optimum while keeping the reactor stable at any possible operating point. The approach consists of three layers: Real Time Optimization (RTO), Model Predictive Control (MPC), and a Target Calculation (TC) layer that coordinates the communication between the other two and guarantees the stability of the whole structure. The proposed algorithm is simulated with the phenomenological model of a styrene polymerization reactor, which has been widely used as a benchmark for process control. The complete optimization structure for the styrene process, including disturbance rejection, is developed. The simulation results show the robustness of the proposed strategy and its capability to handle disturbances while the economic objective is optimized.
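The information flow in the three-layer structure can be made concrete with a toy scalar example: RTO produces an economically optimal setpoint, TC clips it to something the plant can actually reach, and MPC tracks the clipped target. Everything below (models, gains, bounds) is hypothetical; in the paper each layer solves a full optimization problem rather than the one-liners shown here.

```python
# Sketch: RTO -> Target Calculation -> MPC information flow on a toy
# scalar system. All functions are hypothetical stand-ins for the
# optimization problems each layer actually solves.

def rto(price):
    """Economic layer: steady-state optimum for the current price."""
    return 2.0 * price  # hypothetical economic model

def target_calc(y_opt, y_min=0.0, y_max=5.0):
    """Coordination layer: clip the RTO optimum to a reachable target."""
    return min(max(y_opt, y_min), y_max)

def mpc_step(y, y_target, gain=0.5):
    """Control layer: one step of a simple tracking controller."""
    return y + gain * (y_target - y)

y = 1.0
for price in [1.0, 1.0, 3.0, 3.0, 3.0]:  # price disturbance at step 3
    y = mpc_step(y, target_calc(rto(price)))
print(round(y, 3))
```

The key design point the abstract highlights is that TC mediates between the layers: when the economic optimum jumps (here, the price change), the controller is always handed a feasible target, which is what preserves stability of the overall structure.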
Abstract:
The continental margin off SE South America hosts one of the world's most energetic hydrodynamic regimes but also the second largest drainage system of the continent. Both the ocean current system and the fluvial runoff are strongly controlled by the atmospheric circulation modes over the region. The distribution pattern of particular sediment types on the shelf and slope and the long-term build-up of depositional elements within the overall margin architecture are thus the product of both seasonal-to-millennial variability and long-term environmental trends. This talk presents how a combination of different methodological approaches can be used to obtain a comprehensive picture of the variability of a shelf and upper-slope hydrodynamic system during Holocene times. The particular methods applied are: (a) margin-wide stratigraphic information to elucidate the role of sea level for the oceanographic and sedimentary systems since the last glacial maximum; (b) palaeoceanographic sediment proxies combined with palaeo-temperature-indicating isotopes of bivalve shells to trace lateral shifts in the coastal oceanography (particularly of the shelf front) during the Holocene; (c) neodymium isotopes to identify the shelf sediment transport routes resulting from the current regime; (d) sedimentological/geochemical data to show the efficient mechanism of sand export from the shelf to the open ocean; (e) diatom assemblages and sediment element distributions indicating palaeo-salinity and the changing marine influence to illustrate the Plata runoff history. Sea level has controlled not only the overall configuration of the shelf but also the position of the main sediment routes from the continent towards the ocean. The shelf front has shifted frequently since last glacial times, probably as a result of both changes in Westerly Wind intensity and changes in the shelf width itself.
A remarkable southward shift of this front during the past two centuries is possibly related to anthropogenic influences on the atmosphere. The oceanographic regime, with its prominent hydrographic boundaries, has led to a clear separation of sedimentary provinces since shelf drowning. It is especially the shelf front that enhances shelf sediment export through a continuous high sand supply to the uppermost slope. Finally, the Plata River does not continuously provide sediment to the shelf but shows significant climate-related changes in discharge during the past centuries. Starting from these findings, three major fields of research should be further developed in the future: (i) the immediate interaction of the hydrodynamic and sedimentary systems, to close the gaps between deposit information and modern oceanographic dynamics; (ii) material budget calculations for the marginal ocean system in terms of material fluxes, storage/retention capacities, and critical thresholds; (iii) the role of human activity in the atmospheric, oceanographic and solid material systems, to unravel natural vs. anthropogenic effects and feedback mechanisms.
Abstract:
Hermite interpolation is increasingly proving to be a powerful numerical solution tool, as applied to different kinds of second-order boundary value problems. In this work we present two Hermite finite element methods to solve viscous incompressible flow problems in both two- and three-dimensional space. In the two-dimensional case we use the Zienkiewicz triangle to represent the velocity field, and in the three-dimensional case an extension of this element to tetrahedra, still called a Zienkiewicz element. Taking the Stokes system as a model, the pressure is approximated with continuous functions, either piecewise linear or piecewise quadratic, according to the version of the Zienkiewicz element in use, that is, with either incomplete or complete cubics. The methods employ either the standard Galerkin formulation or the Petrov–Galerkin formulation first proposed in Hughes et al. (1986) [18], based on the addition of a balance-of-force term. A priori error analyses point to optimal convergence rates for the PG approach, and for the Galerkin formulation too, at least in some particular cases. From the point of view of both accuracy and the global number of degrees of freedom, the new methods are shown to have a favorable cost-benefit ratio compared with velocity Lagrange finite elements of the same order, especially if the Galerkin approach is employed.
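The defining property behind Hermite elements is that the interpolant matches both function values and derivatives at the nodes. The 1D cubic case below illustrates this in its simplest form; it is a generic illustration of Hermite interpolation, not the Zienkiewicz triangle itself.

```python
# Sketch: 1D cubic Hermite interpolation. The basis functions h00,
# h10, h01, h11 are the standard cubic Hermite basis on [0, 1]; the
# interpolant matches f and f' at both endpoints, so any cubic is
# reproduced exactly.

def hermite_cubic(x, x0, x1, f0, f1, d0, d1):
    """Cubic on [x0, x1] with h(x0)=f0, h(x1)=f1, h'(x0)=d0, h'(x1)=d1."""
    h = x1 - x0
    s = (x - x0) / h  # map to the reference interval [0, 1]
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00 * f0 + h10 * h * d0 + h01 * f1 + h11 * h * d1

# Interpolate f(x) = x^3 on [0, 1] from f(0)=0, f(1)=1, f'(0)=0,
# f'(1)=3; the cubic is reproduced exactly at every point.
print(hermite_cubic(0.5, 0.0, 1.0, 0.0, 1.0, 0.0, 3.0))  # -> 0.125
```

Because the degrees of freedom include derivatives, Hermite elements deliver extra smoothness per global unknown, which is the cost-benefit advantage over same-order Lagrange elements that the abstract reports.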