924 results for l1-norm


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND/OBJECTIVES: Serum amyloid A (SAA) is an acute-phase protein that has been recently correlated with obesity and insulin resistance. Therefore, we first examined whether human recombinant SAA (rSAA) could affect the proliferation, differentiation and metabolism of 3T3-L1 preadipocytes. DESIGN: Preadipocytes were treated with rSAA and analyzed for changes in viability and [H-3-methyl]-thymidine incorporation as well as cell cycle perturbations using flow cytometry analysis. The mRNA expression profiles of adipogenic factors during the differentiation protocol were also analyzed using real-time PCR. After differentiation, 2-deoxy-[1,2-H-3]-glucose uptake and glycerol release were evaluated. RESULTS: rSAA treatment caused a 2.6-fold increase in cell proliferation, which was consistent with the results from flow cytometry showing that rSAA treatment augmented the percentage of cells in the S phase (60.9 +/- 0.54%) compared with the control cells (39.8 +/- 2.2%, ***P<0.001). The rSAA-induced cell proliferation was mediated by the ERK1/2 signaling pathway, which was assessed by pretreatment with the inhibitor PD98059. However, the exposure of 3T3-L1 cells to rSAA during the differentiation process resulted in attenuated adipogenesis and decreased expression of adipogenesis-related factors. During the first 72 h of differentiation, rSAA inhibited the differentiation process by altering the mRNA expression kinetics of adipogenic transcription factors and proteins, such as PPAR gamma 2 (peroxisome proliferator-activated receptor gamma 2), C/EBP beta (CCAAT/enhancer-binding protein beta) and GLUT4. rSAA prevented the intracellular accumulation of lipids and, in fully differentiated cells, increased lipolysis and prevented 2-deoxy-[1,2-H-3]-glucose uptake, which favors insulin resistance. Additionally, rSAA stimulated the secretion of proinflammatory cytokines interleukin 6 and tumor necrosis factor alpha, and upregulated SAA3 mRNA expression during adipogenesis. CONCLUSIONS: We showed that rSAA enhanced proliferation and inhibited differentiation in 3T3-L1 preadipocytes and altered insulin sensitivity in differentiated cells. These results highlight the complex role of SAA in the adipogenic process and support a direct link between obesity and its co-morbidities such as type II diabetes.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents preliminary results on determining small displacements of a global positioning system (GPS) antenna fastened to a structure using only one L1 GPS receiver. Vibrations, periodic or not, are common in large structures, such as bridges, footbridges, tall buildings, and towers under dynamic loads. Their behavior in the time and frequency domains is the subject of structural analysis studies. The hypothesis of this article is that any large structure that presents vibrations in the centimeter-to-millimeter range can be monitored by phase measurements of a single L1 receiver with a high data rate, as long as the displacement direction points toward a particular satellite. In this scenario, the carrier phase is modulated by the antenna displacement. Over a period of a few dozen seconds, the relative displacement to the satellite, the satellite clock, and the atmospheric phase delays can be modeled as a polynomial function of time. The residuals from a polynomial adjustment therefore contain the phase modulation owing to small displacements, along with random noise, short-term receiver clock instabilities, and multipath. The results showed that it is possible to detect displacements of centimeters in the phase data of a single satellite and of millimeters in the difference between the phases of two satellites. After a periodic nonsinusoidal displacement of 10 mm was applied to the antenna, it was clearly recovered in the difference of the residuals. The frequency spectrum obtained with the fast Fourier transform (FFT) exhibited a well-defined peak at the third harmonic, well above the random noise, using the proposed third-degree polynomial model. DOI: 10.1061/(ASCE)SU.1943-5428.0000070. (C) 2012 American Society of Civil Engineers.
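A minimal numerical sketch of this residual approach is given below; all values (a 10 Hz data rate, a 10 mm square-wave motion, the polynomial trend, the noise level) are assumptions for illustration only and are not taken from the paper's data set.

    import numpy as np

    # Short (60 s) window of 10 Hz L1 carrier-phase data: the slow geometry /
    # clock / atmosphere trend is absorbed by a third-degree polynomial, and the
    # small antenna displacement remains in the residuals.
    fs = 10.0                           # data rate (Hz), assumed
    t = np.arange(0.0, 60.0, 1.0 / fs)  # observation window (s)

    L1_WAVELENGTH = 0.1903              # GPS L1 carrier wavelength (m)
    trend = 120.0 + 35.0 * t - 0.02 * t**2 + 1e-4 * t**3            # smooth trend (cycles), synthetic
    displacement = 0.010 * np.sign(np.sin(2 * np.pi * 0.5 * t))     # 10 mm square-wave motion (m), synthetic
    noise = np.random.normal(0.0, 0.002, t.size)                    # phase noise (cycles), synthetic

    phase = trend + displacement / L1_WAVELENGTH + noise            # simulated carrier phase (cycles)

    # Third-degree polynomial adjustment; the residuals carry the displacement signal.
    coeffs = np.polyfit(t, phase, deg=3)
    residuals = phase - np.polyval(coeffs, t)

    # Frequency spectrum of the residuals: the fundamental and odd harmonics of the
    # square-wave displacement should stand out above the noise floor.
    spectrum = np.abs(np.fft.rfft(residuals))
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
    print("dominant residual frequency: %.2f Hz" % freqs[np.argmax(spectrum[1:]) + 1])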

Relevance:

20.00%

Publisher:

Abstract:

Adiponectin and interleukin 10 (IL-10) are adipokines that are predominantly secreted by differentiated adipocytes and are involved in energy homeostasis, insulin sensitivity, and the anti-inflammatory response. These two adipokines are reduced in obese subjects, which favors increased activation of nuclear factor kappa B (NF-kappa B) and leads to elevation of pro-inflammatory adipokines. However, the effects of adiponectin and IL-10 on NF-kappa B DNA binding activity (NF-kappa Bp50 and NF-kappa Bp65) and on the expression of proteins involved in the toll-like receptor (TLR-2 and TLR-4) pathway, such as MYD88 and TRAF6, in lipopolysaccharide-treated 3T3-L1 adipocytes are unknown. Treatment of 3T3-L1 adipocytes with lipopolysaccharide for 24 h elevated IL-6 levels; activated the NF-kappa B pathway cascade; increased protein expression of IL-6R, TLR-4, MYD88, and TRAF6; and increased the nuclear activity of NF-kappa B (p50 and p65) DNA binding. Adiponectin and IL-10 inhibited both the elevation of IL-6 levels and the activation of NF-kappa B (p50 and p65) DNA binding. Taken together, the present results provide evidence that adiponectin and IL-10 have an important role in the anti-inflammatory response in adipocytes. In addition, inhibition of NF-kappa B signaling pathways may be an excellent strategy for the treatment of inflammation in obese individuals. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

A systematic approach to modeling nonlinear systems using norm-bounded linear differential inclusions (NLDIs) is proposed in this paper. The resulting NLDI model is suitable for the application of linear control design techniques and, therefore, it is possible to fulfill certain specifications for the underlying nonlinear system, within an operating region of interest in the state space, using a linear controller designed for this NLDI model. Hence, a procedure to design a dynamic output-feedback controller for the NLDI model is also proposed in this paper. One of the main contributions of the proposed modeling and control approach is the use of the mean-value theorem to represent the nonlinear system by a linear parameter-varying model, which is then mapped into a polytopic linear differential inclusion (PLDI) within the region of interest. To avoid the combinatorial problem that is inherent to polytopic models for medium- and large-sized systems, the PLDI is transformed into an NLDI, and the whole process is carried out ensuring that all trajectories of the underlying nonlinear system are also trajectories of the resulting NLDI within the operating region of interest. Furthermore, it is also possible to choose a particular structure for the NLDI parameters to reduce the conservatism in the representation of the nonlinear system by the NLDI model, and this feature is another important contribution of this paper. Once the NLDI representation of the nonlinear system is obtained, the paper proposes the application of a linear control design method to this representation. The design is based on quadratic Lyapunov functions and formulated as a search problem over a set of bilinear matrix inequalities (BMIs), which is solved using a two-step separation procedure that maps the BMIs into a set of corresponding linear matrix inequalities. Two numerical examples are given to demonstrate the effectiveness of the proposed approach.
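For reference, a norm-bounded linear differential inclusion can be written in the form that is standard in the LMI literature (a sketch in assumed notation; the paper's own symbols may differ):

    \dot{x}(t) = A\,x(t) + B_p\,p(t) + B_u\,u(t), \qquad
    q(t) = C_q\,x(t), \qquad
    p(t) = \Delta(t)\,q(t), \quad \|\Delta(t)\| \le 1,

which is equivalent to \dot{x} \in \{ (A + B_p\,\Delta\,C_q)\,x + B_u\,u : \|\Delta\| \le 1 \}. The modeling procedure described above must guarantee that, inside the operating region, every trajectory of the nonlinear system belongs to this set of trajectories.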

Relevance:

20.00%

Publisher:

Abstract:

Stone Age research on Northern Europe frequently makes gross generalizations about the Mesolithic and Neolithic, although we still lack much basic knowledge on how the people lived. The transition from the Mesolithic to the Neolithic in Europe has been described as a radical shift from an economy dominated by marine resources to one solely dependent on farming. Both the occurrence and the geographical extent of such a drastic shift can be questioned, however. It is therefore important to start out at a more detailed level of evidence in order to present the overall picture, and to account for the variability even in such regional or chronological overviews. Fifteen Stone Age sites were included in this study, ranging chronologically from the Early Mesolithic to the Middle or Late Neolithic, c. 8300–2500 BC, and stretching geographically from the westernmost coast of Sweden to the easternmost part of Latvia within the confines of latitudes 55–59° N. The most prominent sites in terms of the number of human and faunal samples analysed are Zvejnieki, Västerbjers and Skateholm I–II. Human and faunal skeletal remains were subjected to stable carbon and nitrogen isotope analysis to study diet and ecology at the sites. Stable isotope analyses of human remains provide quantitative information on the relative importance of various food sources, an important addition to the qualitative data supplied by certain artefacts and structures or by faunal or botanical remains. A vast number of new radiocarbon dates were also obtained. In conclusion, a rich diversity in Stone Age dietary practice in the Baltic Region was demonstrated. Evidence ranging from the Early Mesolithic to the Late Neolithic shows that neither chronology nor location alone can account for this variety, but that there are inevitably cultural factors as well. Food habits are culturally governed, and therefore we cannot automatically assume that people at similar sites will have the same diet. Stable isotope studies are very important here, since they tell us what people actually consumed, not only what was available, or what one single meal contained. We should be wary of inferring diet from ritually deposited remains, since things that were mentally important were not always important in daily life. Thus, although a ritual and symbolic norm may emphasize certain food categories, these may in fact contribute very little to the diet. Advances in the analysis of intra-individual variation have produced new data on life-history changes, revealing mobility patterns, breastfeeding behaviour and certain dietary transitions. The inclusion of faunal data has proved invaluable for understanding the stable isotope ecology of a site, thereby improving the precision of the interpretations of human stable isotope data. The special case of dogs, though, demonstrates that these animals are not useful for inferring human diet, since, owing to the many roles they play in human society, dogs can deviate significantly from humans in their diet, and in several cases have been shown to do so. When evaluating radiocarbon data derived from human and animal remains from the Pitted-Ware site of Västerbjers on Gotland, the importance of establishing the stable isotope ecology of the site before making deductions about reservoir effects was further demonstrated. The main aim of this thesis has been to demonstrate the variation and diversity in human practices, challenging the view of a "monolithic" Stone Age.
By looking at individuals and not only at populations, the thesis accounts for the whole range of human behaviour, also revealing discrepancies between norm and practice, which are frequently visible both in the archaeological record and in present-day human behaviour.

Relevance:

20.00%

Publisher:

Abstract:

We analyze the discontinuity-preserving problem in TV-L1 optical flow methods. This type of method typically creates rounded effects at flow boundaries, which usually do not coincide with object contours. A simple strategy to overcome this problem consists of inhibiting the diffusion at high image gradients. In this work, we first introduce a general framework for TV regularizers in optical flow and relate it to some standard approaches. Our survey takes into account several methods that use decreasing functions to mitigate the diffusion at image contours. However, this kind of strategy may produce instabilities in the estimation of the optical flow. Hence, we study the problem of instabilities and show that it actually arises from an ill-posed formulation. From this study, it is possible to devise different schemes to solve the problem. One of these consists of separating the pure TV process from the mitigating strategy; this has been used in another work, and we demonstrate here that it performs well. Furthermore, we propose two alternatives to avoid the instability problems: (i) a fully automatic approach that solves the problem based on the information of the whole image; and (ii) a semi-automatic approach that takes into account the image gradients in a close neighborhood, adapting the parameter at each position. In the experimental results, we present a detailed study and comparison of the different alternatives. These methods provide very good results, especially for sequences with a few dominant gradients. Additionally, a surprising effect of these approaches is that they can cope with occlusions. This can be achieved by using strong regularization and high penalization at image contours.
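As an illustration of the kind of energy under discussion (the notation here is assumed, not quoted from the paper), a TV-L1 optical-flow functional with a decreasing gradient weight can be written as:

    E(u) = \int_{\Omega} g\bigl(|\nabla I_0|\bigr)\,\bigl(|\nabla u_1| + |\nabla u_2|\bigr)
           + \lambda\,\bigl|I_1(\mathbf{x} + u(\mathbf{x})) - I_0(\mathbf{x})\bigr|\,d\mathbf{x},
    \qquad g(s) = \exp(-\alpha s^{\beta}),

where u = (u_1, u_2) is the flow field and g is a decreasing function that inhibits diffusion at high image gradients; taking g \equiv 1 recovers the pure TV-L1 model, and the instabilities discussed above stem from how g enters this formulation.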

Relevance:

20.00%

Publisher:

Abstract:

Approximate inverses of real nonsingular matrices, based on Frobenius norm minimization, are analyzed from a purely theoretical point of view. In this context, this paper provides several sufficient conditions that guarantee the possibility of improving (in the sense of the Frobenius norm) some given approximate inverses. Moreover, the optimal approximate inverses of a matrix A ∈ R^{n×n}, among all matrices belonging to certain subspaces of R^{n×n}, are obtained. In particular, a natural generalization of the classical normal equations of the system Ax = b is given when searching for approximate inverses N ≠ A^T such that AN is symmetric and ‖AN − I‖_F < ‖AA^T − I‖_F …
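A minimal numerical sketch of the underlying idea, Frobenius-norm minimization over a subspace, follows; the test matrix, the choice of the diagonal subspace, and the function name are illustrative assumptions, not the paper's construction.

    import numpy as np

    # The Frobenius-optimal approximate inverse of A over the subspace spanned by
    # matrices M_1..M_k is a linear least-squares problem in the coefficients c,
    # since ||A (sum_k c_k M_k) - I||_F is the Euclidean norm of a vectorized residual.
    def optimal_approx_inverse(A, basis):
        n = A.shape[0]
        design = np.column_stack([(A @ M).ravel() for M in basis])  # columns are vec(A @ M_k)
        target = np.eye(n).ravel()
        c, *_ = np.linalg.lstsq(design, target, rcond=None)
        return sum(ck * M for ck, M in zip(c, basis))

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5)) + 5.0 * np.eye(5)   # a nonsingular test matrix (assumed)
    diag_basis = [np.diag(e) for e in np.eye(5)]        # subspace of diagonal matrices
    N = optimal_approx_inverse(A, diag_basis)

    fro = lambda M: np.linalg.norm(M, "fro")
    print("||A N   - I||_F =", fro(A @ N - np.eye(5)))
    print("||A A^T - I||_F =", fro(A @ A.T - np.eye(5)))  # the comparison point N = A^T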

Relevance:

20.00%

Publisher:

Abstract:

X-ray absorption spectroscopy (XAS) is a powerful means of investigating structural and electronic properties in condensed-matter physics. Analysis of the near-edge part of the XAS spectrum, the so-called X-ray Absorption Near Edge Structure (XANES), can typically provide the following information on the photoexcited atom:
- oxidation state and coordination environment;
- speciation of transition metal compounds;
- conduction band DOS projected on the excited atomic species (PDOS).
Analysis of XANES spectra is greatly aided by simulations; in the most common scheme, the multiple scattering framework is used with the muffin-tin approximation for the scattering potential, and the spectral simulation is based on a hypothetical reference structure. This approach has the advantage of requiring relatively little computing power, but in many cases the assumed structure is quite different from the actual system measured, and the muffin-tin approximation is not adequate for low-symmetry structures or highly directional bonds. It is therefore of great interest to develop alternative methods. In one approach, the spectral simulation is based on atomic coordinates obtained from a DFT (Density Functional Theory) optimized structure. In another approach, which is the object of this thesis, the XANES spectrum is calculated directly from an ab initio DFT calculation of the atomic and electronic structure. This method takes full advantage of the true many-electron final wavefunction, which can be computed with DFT algorithms that include a core hole on the absorbing atom, to compute the final cross section. To calculate the many-electron final wavefunction, the Projector Augmented Wave (PAW) method is used. In this scheme, the absorption cross section is written as a sum of contributions involving the many-electron wavefunction of the final state; it is calculated starting from the pseudo-wavefunction and reconstructing the true wavefunction by means of a transformation operator parameterized by partial waves and projector functions. The aim of my thesis is to apply and test the PAW methodology for the calculation of the XANES cross section. I have focused on iron and silicon structures and on some target biological molecules (myoglobin and cytochrome c). Other inorganic and biological systems could be considered for future applications of this methodology, which could become an important improvement with respect to the multiple scattering approach.
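For reference, the PAW reconstruction alluded to above takes the standard form (standard notation, not necessarily the thesis's own symbols):

    |\psi_n\rangle = \mathcal{T}\,|\tilde{\psi}_n\rangle
                   = |\tilde{\psi}_n\rangle
                     + \sum_i \bigl(|\phi_i\rangle - |\tilde{\phi}_i\rangle\bigr)
                       \langle \tilde{p}_i | \tilde{\psi}_n\rangle,

where |\tilde{\psi}_n\rangle is the pseudo-wavefunction, |\phi_i\rangle and |\tilde{\phi}_i\rangle are the all-electron and pseudo partial waves, and \langle\tilde{p}_i| are the projector functions; in the dipole approximation, the XANES cross section is then built from matrix elements of the dipole operator between the core state and the reconstructed final states.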

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses the problem of computed tomography image reconstruction using a model with total variation as the regularization term and the 1-norm as the fidelity term (TV/L1 model). The problem is solved by modifying an alternating minimization method originally used for deblurring and denoising images affected by impulsive noise. The method was tested with Gaussian noise and with both fan-beam and parallel-beam geometries. Finally, the results obtained from the experiments are reported.
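A sketch of the TV/L1 model described above, in notation assumed here (A the projection operator for the chosen fan-beam or parallel-beam geometry, b the measured data):

    \min_{u}\; \mathrm{TV}(u) + \lambda\,\|A u - b\|_1,
    \qquad \mathrm{TV}(u) = \sum_{i,j} \bigl\|(\nabla u)_{i,j}\bigr\|_2,

where the 1-norm fidelity replaces the usual least-squares term; an alternating-minimization scheme splits the problem into simpler subproblems, typically by introducing auxiliary variables for the image gradient and for the residual Au − b.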

Relevance:

20.00%

Publisher:

Abstract:

Data from NMR spectroscopy are the effect of phenomena described through the Laplace transform of the source that produced them. This is an inverse problem with discrete data, and it calls for numerical methods for the inversion of the Laplace transform with discrete data, which is notoriously an ill-posed problem and therefore requires regularization methods. In this context, a variant of the models found in the literature, which use the L2 norm, is proposed by introducing the L1 norm.
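A sketch of the formulation this describes, with symbols assumed here (g the sampled decay data, K the discretized Laplace kernel with entries K_{ij} = e^{-t_i s_j}, f the sought distribution of relaxation components):

    g_i = \int_0^{\infty} e^{-t_i s}\, f(s)\,ds + \varepsilon_i
    \quad\Longrightarrow\quad
    \min_{f}\; \|K f - g\|_2^2 + \lambda\,\|f\|_1,

where the L1 penalty is the proposed variant of the classical L2 (Tikhonov-type) regularization term \|f\|_2^2.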

Relevance:

20.00%

Publisher:

Abstract:

The aim of this work is to implement, in an efficient and reliable way, a Newton-type method for image reconstruction with an L1-norm regularization term. In particular, two methods, named "OWL-QN per inversione" (OWL-QN for inversion) and "OWL-QN precondizionato" (preconditioned OWL-QN), are presented and tested through numerous experiments. The methods are derived by taking into account the peculiarities of the problem and the properties of the discrete Fourier transform. The results of the numerical experiments show the value of the proposed contribution, demonstrating the superiority of the new methods over the OWL-QN method found in the literature, even when the latter is adapted to images.
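For context, the baseline OWL-QN method from the literature, which the two variants build on, minimizes an L1-regularized objective by means of an orthant-wise pseudo-gradient (standard formulation, not the thesis's own variants):

    F(x) = f(x) + \lambda\,\|x\|_1, \qquad
    \diamond_i F(x) =
    \begin{cases}
      \partial_i f(x) + \lambda, & \text{if } x_i > 0, \text{ or } x_i = 0 \text{ and } \partial_i f(x) + \lambda < 0,\\
      \partial_i f(x) - \lambda, & \text{if } x_i < 0, \text{ or } x_i = 0 \text{ and } \partial_i f(x) - \lambda > 0,\\
      0, & \text{otherwise},
    \end{cases}

with the quasi-Newton (L-BFGS) search direction computed from this pseudo-gradient and each step projected back onto the orthant of the current iterate.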

Relevance:

20.00%

Publisher:

Abstract:

The idea for this study stems from the desire to understand which competences an interpreter needs in order to communicate effectively. In particular, we analysed the communicative competence of interpreters in their mother tongue (in this case, Italian), since this competence is often taken for granted, simply because it is the language they have spoken since childhood. To achieve this goal, we transcribed and analysed 20 consecutive interpretations delivered by students of the DIT (Dipartimento di Interpretazione e Traduzione) in Forlì during the final exams held between 2009 and 2014. This dissertation is divided into four chapters. In the first, theoretical chapter, we propose a definition of the concept of communicative competence, which is in turn divided into linguistic, pragmatic and socio-pragmatic competence. We then highlight the other competences that a professional conference interpreter should have. The second chapter is devoted to the concept of quality in interpreting, a principle that is extremely important for understanding which characteristics the speech produced by an interpreter must satisfy. In the third, practical chapter, we present the data collection method and the results of the analysis of the transcribed consecutive interpretations, highlighting some communication problems such as linguistic interference, calques, lexical and morphosyntactic problems, incoherence and omissions. Finally, in the fourth chapter we offer some concluding remarks on the topics addressed in the previous chapters.

Relevance:

20.00%

Publisher:

Abstract:

The work in this thesis mainly concerns the upgrade, simulation and testing of VME boards called ReadOut Drivers (RODs), which are part of the data processing and acquisition chain of IBL (Insertable B-Layer). IBL is the new component of the Pixel Detector of the ATLAS experiment at CERN, inserted into the detector during the LHC shutdown; until 2012 the Pixel Detector consisted of three layers, called (starting from the innermost): Barrel Layer 0, Layer 1 and Layer 2. However, the increase in LHC luminosity, the ageing of the pixels and the demand for ever more precise measurements made it necessary to improve the detector. Thus, starting from the beginning of 2013, IBL (which until then had been a project developed and funded separately from the Pixel Detector) became part of the ATLAS Pixel Detector and was installed between the beam pipe and layer B0. This thesis first provides a general overview of the ATLAS experiment at CERN, covering both physical and technical aspects, and then describes the various parts of the detector in detail, with particular attention to the Insertable B-Layer. On this last point, the thesis focuses on the reasons that led to its construction, on design aspects, on the technologies used (aimed at making IBL and the rest of the Pixel Detector as compatible as possible) and on development and fabrication choices. The thesis then covers the data read-out chain, describing the techniques for interfacing with the front-end chips, and in particular concentrates on the work carried out for the upgrade and development of the ReadOut Driver (ROD) boards, introducing the improvements I made, aimed at removing defects, improving performance and preparing the system for a performance analysis of the detector. At present the boards have been produced and mounted and are already part of the data acquisition system of the ATLAS Pixel Detector, but the firmware is continuously being updated. My work focused mainly on debugging and improving the ROD boards; in particular, I added two features:
- parallel programming of the ROD FPGAs via VME. IBL requires 15 ROD boards, and programming them all together (instead of one at a time) yields a considerable saving in programming time, which is especially useful during testing;
- reset of the Phase-Locked Loop (PLL) via VME. The PLL is a chip on the ROD that distributes the clock to all components of the board; being able to reset this chip remotely makes it possible to solve synchronization problems.
The ReadOut Drivers will also be used by other layers of the Pixel Detector: in addition to IBL, the data from layers 1 and 2 of the ATLAS pixel sensors will be acquired using the hardware chain designed, built and tested in Bologna.