997 results for Signal generation
Abstract:
Although the murine 3T3-F442A preadipocyte line is the best characterized in vitro model of GH action, the mechanisms by which GH induces differentiation of these cells remain unclear. Here we have examined the role of three transcriptional regulators in adipogenesis. These regulators are either rapidly induced in response to GH [Stra13, signal transducer and activator of transcription (Stat) 3] or of central importance to GH signaling (Stat5). Retroviral transfection of 3T3-F442A preadipocytes was used to increase expression of Stra13, Stat3, and Stat5a. Only Stat5a transfection increased the expression of the adipogenic markers peroxisome proliferator-activated receptor gamma, CCAAT enhancer binding protein (C/EBP)alpha, and adipose protein 2/fatty acid-binding protein in response to GH, as determined by quantitative RT-PCR. Transfection with constitutively active Stat3 and Stat5a revealed that constitutively active Stat5a, but not Stat3, was able to replace the GH requirement for adipogenesis. Constitutively active Stat5a, but not Stat3, increased the formation of lipid droplets and the expression of alpha-glycerol phosphate dehydrogenase toward levels seen in mature adipocytes. Constitutively active Stat5a also increased the expression of transcripts for C/EBPalpha to levels similar to those induced by GH, and of C/EBPbeta, peroxisome proliferator-activated receptor gamma, and adipose protein 2/fatty acid-binding protein transcripts to a lesser extent. An in vivo role for GH in murine adipogenesis is supported by the significantly decreased epididymal fat depot size in young GH receptor-deleted mice, before manifestation of the lipolytic actions of GH. We conclude that Stat5 is a critical factor in GH-induced, and potentially prolactin-induced, murine adipogenesis.
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the outputs of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
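The core idea of back-to-back testing can be illustrated with a deliberately simple sketch (my own toy example, not the paper's observer-based residual structuring): two independently written implementations of the same model are driven with identical inputs, and any nonzero residual between their outputs flags a coding error in one of them.

```python
# Back-to-back testing sketch: hypothetical model, not from the paper.

def model_ref(x):
    # reference implementation of the model y = 2*x + 1
    return 2 * x + 1

def model_alt(x):
    # independent re-implementation containing a deliberate coding error
    return 2 * x + 1 if x < 3 else 2 * x  # bug: "+ 1" dropped for x >= 3

inputs = range(6)
# Residuals: identical inputs, compare the two outputs element by element.
residuals = [model_ref(x) - model_alt(x) for x in inputs]
print(residuals)  # nonzero entries mark the inputs that expose the error
```

In the paper's methodology the residuals are additionally shaped by a modified observer so that each error maps to a known subspace; the sketch above only shows the detection step, not that isolation structure.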
Abstract:
Background: Although excessive ethanol consumption is known to lead to a variety of adverse effects in the heart, the molecular mechanisms of such effects have remained poorly defined. We hypothesized that posttranslational covalent binding of reactive molecular species to proteins occurs in the heart in response to acute ethanol exposure. Methods: The generation of protein adducts with several aldehydic species was examined by using monospecific antibodies against adducts with malondialdehyde (MDA), acetaldehyde (AA), MDA-AA hybrids, and hydroxyethyl radicals. Specimens of heart tissue were obtained from rats after intraperitoneal injections with alcohol (75 mmol/kg body weight) with or without pretreatment with cyanamide (0.05 mmol/kg body weight), an aldehyde dehydrogenase inhibitor. Results: The amounts of MDA and unreduced AA adducts were found to be significantly increased in the hearts of the rats treated with ethanol, cyanamide, or both, whereas no other adducts were detected in statistically significant quantities. Immunohistochemical studies for characterization of adduct distribution revealed sarcolemmal adducts of both MDA and AA in the rats treated with ethanol and cyanamide, in addition to intracellular adducts, which were also present in the group treated with ethanol alone. Conclusions: These findings support the role of enhanced lipid peroxidation and the generation of protein-aldehyde condensates in vivo as a result of excessive ethanol intake. These findings may have implications for the molecular mechanisms of cardiac dysfunction in alcoholics.
Abstract:
The ability to monitor fetal heart rate is vital during late pregnancy and labor in order to evaluate fetal well-being. Current monitoring practice is essentially based on external cardiotocography and, less frequently during labor, invasive fetal scalp electrocardiography. Many current and envisaged applications could benefit from simpler devices using a 3-lead ECG configuration. We are designing a maternity support belt with an embedded wireless 3-lead ECG sensor, and have investigated the influence of the ground electrode position on signal quality. Data from over 100 pregnant women were collected with the ground electrode placed in 3 locations in order to determine the optimum electrode placement and belt form factor.
Abstract:
The paper proposes a methodology focused on the generation of strategic plans of action, emphasizing the relevance of a structured timeframe classification for the actions. The methodology explicitly recognizes the importance of long-term goals as strategic drivers, which must ensure that the complex system can respond effectively to changes in the environment. In addition, the methodology employs engineering systems techniques in order to understand the inner workings of the system and to build alternative plans of action. Owing to these aspects, the proposed approach offers greater flexibility than traditional methods. The validity and effectiveness of the methodology are demonstrated by analyzing an airline company composed of five subsystems, with the aim of defining a plan of action for the next five years that can improve efficiency, redefine the mission, or increase revenues.
Abstract:
Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images exhibit a low signal-to-noise ratio and a decay of intensity over time due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which permanently loses its ability to fluoresce owing to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation caused by photobleaching, makes long-term biological observation very difficult. In this paper a denoising algorithm for Poisson data is described in which the photobleaching effect is explicitly taken into account. The algorithm is designed in a Bayesian framework, where the data fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by photobleaching. The prior term is built from Gibbs priors with log-Euclidean potential functions, suitable for coping with the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm, and one example with real data illustrates its application.
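The observation model described above can be sketched with toy parameters of my own choosing (one pixel, invented intensity and bleaching rate): the underlying fluorescence decays exponentially over time, and each observed value is a Poisson count drawn from that decayed mean — exactly the kind of synthetic data used in the paper's Monte Carlo tests.

```python
# Forward model sketch: exponential photobleaching decay + Poisson counting noise.
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplication method; adequate for the small means used here
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
x_true = 50.0        # assumed underlying intensity of one pixel
decay_rate = 0.05    # assumed photobleaching rate per frame

# Mean observed intensity decays exponentially frame by frame...
means = [x_true * math.exp(-decay_rate * t) for t in range(100)]
# ...and each observation is a Poisson count of that decayed mean.
observed = [sample_poisson(m, rng) for m in means]

# The counts fade with time even though x_true is constant: this combined
# degradation is what the Bayesian denoiser must undo.
print(sum(observed[:10]), sum(observed[-10:]))
```

The paper's algorithm inverts this model jointly for all pixels, regularized by the Gibbs prior; the sketch only shows the data-generation side.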
Abstract:
One of the most efficient approaches to generating the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated from past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it is useful to design an architecture in which the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique in which a low quality Intra block sent by the encoder is used to generate the SI through motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low quality Intra coded blocks, for blocks where MCI produces SI with lower correlation. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
Abstract:
A two-terminal optically addressed image processing device based on two stacked sensing/switching p-i-n a-SiC:H diodes is presented. The charge packets are injected optically into the p-i-n sensing photodiode and confined at the illuminated regions, locally changing the electric field profile across the p-i-n switching diode. A red scanner is used for charge readout. The various design parameters and addressing architecture trade-offs are discussed. The influence on the transfer functions of an a-SiC:H sensing absorber optimized for red transmittance and blue collection, or of a floating anode in between, is analysed. Results show that the thin a-SiC:H sensing absorber confines the readout to the switching diode and filters the light, allowing full colour detection at two appropriate voltages. When the floating anode is used, the spectral response broadens, allowing B&W image recognition with improved light-to-dark sensitivity. A physical model supports the image and colour recognition process.
Abstract:
Results on the use of a double a-SiC:H p-i-n heterostructure for signal multiplexing and demultiplexing applications in the visible range are presented. Pulsed monochromatic beams together (multiplexing mode), or a single polychromatic beam (demultiplexing mode), impinge on the device and are absorbed according to their wavelength. Red, green and blue pulsed input channels are transmitted together, each with a specific transmission rate. The combined optical signal is analyzed by reading out the generated photocurrent under different applied voltages. Results show that in the multiplexing mode the output signal is balanced by the wavelength and transmission rate of each input channel, keeping the memory of the incoming optical carriers. In the demultiplexing mode the photocurrent is controlled by the applied voltage, allowing the transmitted information to be recovered. A physical model supported by a numerical simulation gives insight into the device operation.
Abstract:
We study the implications for two-Higgs-doublet models of the recent announcement at the LHC giving a tantalizing hint for a Higgs boson of mass 125 GeV decaying into two photons. We require that the experimental result be within a factor of 2 of the theoretical standard model prediction, and analyze the type I and type II models as well as the lepton-specific and flipped models, subject to this requirement. It is assumed that there is no new physics other than two Higgs doublets. In all of the models, we display the allowed region of parameter space taking the recent LHC announcement at face value, and we analyze the W+W-, ZZ, b b-bar, and tau+ tau- expectations in these allowed regions. Throughout the entire range of parameter space allowed by the gamma-gamma constraint, the numbers of events for Higgs decays into WW, ZZ, and b b-bar are not changed from the standard model by more than a factor of 2. In contrast, in the lepton-specific model, decays to tau+ tau- are very sensitive across the entire gamma-gamma-allowed region.
Abstract:
A Blumlein line is a particular Pulse Forming Line (PFL) configuration that allows the generation of high-voltage sub-microsecond square pulses, with the same voltage amplitude as the dc charging voltage, into a matched load. By stacking n Blumlein lines one can, in theory, multiply the input dc charging voltage amplitude by n. In order to understand the operating behavior of this electromagnetic system and to further optimize its operation, it is fundamental to model it theoretically, that is, to calculate the voltage amplitudes at each circuit point and the time instants at which they occur. To do this, one needs to define the reflection and transmission coefficients at each point where an impedance discontinuity occurs. The experimental results of a fast solid-state switch discharging a three-stage Blumlein stack are compared with the theoretical ones.
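The coefficients mentioned above follow the standard transmission-line relations for a voltage wave crossing an impedance discontinuity; the sketch below uses illustrative impedance values of my own, not the paper's circuit.

```python
# Voltage reflection and transmission coefficients at an impedance step,
# as used when tracing pulse propagation through a Blumlein stack.

def reflection_coefficient(z1, z2):
    # wave travelling from a line of impedance z1 into one of impedance z2
    return (z2 - z1) / (z2 + z1)

def transmission_coefficient(z1, z2):
    # the transmitted amplitude is 1 + reflection coefficient
    return 1.0 + reflection_coefficient(z1, z2)

# Matched load (z2 == z1): no reflection, full transmission.
print(reflection_coefficient(50.0, 50.0), transmission_coefficient(50.0, 50.0))
# Near-open termination (very large z2): near-total positive reflection.
print(reflection_coefficient(50.0, 5e6))
```

Applying these two relations at every discontinuity of the stack yields the voltage amplitude and arrival time at each circuit point, which is the calculation the abstract refers to.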
Abstract:
This paper presents an algorithm to efficiently generate the state-space of systems specified using the IOPT Petri-net modeling formalism. IOPT nets are a non-autonomous Petri-net class based on Place-Transition nets, with an extended set of features designed to allow the rapid prototyping and synthesis of system controllers through an existing hardware-software co-design framework. To obtain coherent and deterministic operation, IOPT nets use a maximal-step execution semantics in which, in a single execution step, all enabled transitions fire simultaneously. This increases the resulting state-space complexity and can cause an arc "explosion" effect: real-world applications with several million states can reach a number of arcs an order of magnitude higher, leading to the need for high-performance state-space generator algorithms. The proposed algorithm applies a compilation approach, reading a PNML file containing one IOPT model and automatically generating an optimized C program to calculate the corresponding state-space.
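Maximal-step semantics can be illustrated with a minimal sketch (my own two-transition toy net; transition conflicts are ignored for simplicity, and none of the IOPT extensions or the paper's compiled-C machinery is modeled): in each step, every enabled transition fires at once, and reachable markings are collected breadth-first.

```python
# Maximal-step state-space generation for a tiny Place-Transition net.
from collections import deque

# transition -> (input places with arc weights, output places with arc weights)
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def maximal_step(marking):
    """Fire all enabled transitions simultaneously (maximal-step semantics)."""
    step = [t for t, (pre, _) in transitions.items() if enabled(marking, pre)]
    new = dict(marking)
    for t in step:
        pre, post = transitions[t]
        for p, w in pre.items():
            new[p] -= w
        for p, w in post.items():
            new[p] = new.get(p, 0) + w
    return frozenset(new.items()), bool(step)

def state_space(initial):
    """Breadth-first collection of all reachable markings."""
    seen = {frozenset(initial.items())}
    queue = deque([initial])
    while queue:
        succ, fired = maximal_step(queue.popleft())
        if fired and succ not in seen:
            seen.add(succ)
            queue.append(dict(succ))
    return seen

states = state_space({"p1": 1, "p2": 0, "p3": 0})
print(len(states))
```

Because every step is the set of all enabled transitions, each marking has a single successor here; with conflicts and non-autonomous input signals (as in IOPT nets) the successor computation becomes substantially more involved.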
Abstract:
This work addresses the present-day (<100 ka) mantle heterogeneity in the Azores region through the study of two active volcanic systems from Terceira Island. Our study shows that mantle heterogeneities are detectable even when "coeval" volcanic systems (Santa Barbara and Fissural) erupted less than 10 km apart. These volcanic systems reflect, respectively, the influence of the Terceira and D. Joao de Castro Bank end-members defined by Beier et al. (2008) for the Terceira Rift. Santa Barbara magmas are interpreted to be the result of mixing between a HIMU-type component, carried to the upper mantle by the Azores plume, and the regional depleted MORB magmas/source. Fissural lavas are characterized by higher Ba/Nb and Nb/U ratios and less radiogenic Pb-206/Pb-204, Nd-143/Nd-144 and Hf-176/Hf-177, requiring a small contribution of delaminated sub-continental lithospheric mantle residing in the upper mantle. Published noble gas data on lavas from both volcanic systems also indicate the presence of a relatively undegassed component, which is interpreted as inherited from a lower mantle reservoir sampled by the ascending Azores plume. As inferred from trace and major elements, melting began in the garnet stability field, while magma extraction occurred within the spinel zone. The intra-volcanic-system chemical heterogeneity is mainly explained by variable proportions of the above-mentioned local end-members and by crystal fractionation processes. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
In a smart grid context, end consumers are seen as active players. The distributed generation resources applied in a smart home system, as micro- and small-scale systems, can include wind generation, photovoltaics, and combined heat and power facilities. The paper addresses the management of domestic consumer resources, i.e. wind generation, solar photovoltaics, combined heat and power, an electric vehicle with gridable capability, and loads, in a SCADA system with an intelligent methodology to support the user's decisions in real time. The main goal is to better manage the excess wind generation that may arise from the consumer's distributed generation resources. The optimization methodology is performed in a SCADA House Intelligent Management context, and the results are analyzed to validate the SCADA system.
Using demand response to deal with unexpected low wind power generation in the context of smart grid
Abstract:
Demand response is assumed to be an essential resource for fully achieving the operating benefits of smart grids, namely in the context of competitive markets. Some advantages of Demand Response (DR) programs and of smart grids can only be achieved through the implementation of Real Time Pricing (RTP). The integration of the expected increasing amounts of distributed energy resources, as well as of new players, requires new approaches to the changing operation of power systems. The proposed methodology aims at minimizing the operation costs in a smart grid operated by a virtual power player. It is especially useful when the actual wind generation and the day-ahead forecast differ significantly. When facing lower wind power generation than expected, RTP is used in order to minimize the impacts of such a change in wind availability. The application of the proposed model is illustrated using the scenario of a day with a marked wind availability reduction in the Portuguese power system (8th February 2012).
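The idea of covering an unexpected wind shortfall at minimum cost can be sketched with a toy merit-order dispatch (all names, capacities, and prices below are invented for illustration; the paper's methodology is a full optimization run by a virtual power player, not this greedy rule):

```python
# Toy merit-order dispatch: cover a wind generation shortfall with the
# cheapest mix of demand response and other resources, cheapest first.

def dispatch_shortfall(shortfall_mw, resources):
    """resources: list of (name, capacity_mw, cost_eur_per_mwh)."""
    schedule, total_cost = {}, 0.0
    for name, cap, cost in sorted(resources, key=lambda r: r[2]):
        used = min(cap, shortfall_mw)
        if used > 0:
            schedule[name] = used
            total_cost += used * cost
            shortfall_mw -= used
    return schedule, total_cost, shortfall_mw

# Hypothetical resources available to the operator (assumed numbers).
resources = [
    ("gas_unit", 60.0, 90.0),
    ("demand_response", 40.0, 60.0),
    ("reserve_import", 100.0, 120.0),
]
schedule, cost, unmet = dispatch_shortfall(80.0, resources)
print(schedule, cost, unmet)
```

Here demand response is the cheapest option and is used first, which mirrors the abstract's point that RTP-driven DR can reduce the cost impact of a wind availability change.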