885 results for Deductive and Inductive Norm Internalization
Abstract:
First published online: 30 October 2015
Abstract:
We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric on the set of probability measures. Existence, uniqueness, boundedness of moments, and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t → ∞ to this diffusive equilibrium in this distance, which metrizes the weak convergence of measures. We then prove a uniform-in-time bound on Sobolev norms of the solution, provided the initial data has a finite norm in the corresponding Sobolev space. These results are then combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹ norm, as well as in various Sobolev norms.
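The interpolation step can be sketched in generic form (the weak distance, exponents, and rates here are illustrative assumptions, not the paper's exact statement): if the distance to equilibrium decays like e^{-λt} in a metric comparable to a negative Sobolev norm H^{-s}, and the H^k norms stay uniformly bounded in time, then for a suitable θ ∈ (0,1),

```latex
\| f(t) - f_\infty \|
  \le C\, \| f(t) - f_\infty \|_{H^{-s}}^{\theta}\,
        \| f(t) - f_\infty \|_{H^{k}}^{1-\theta}
  \le C'\, e^{-\theta \lambda t},
```

so exponential decay in the weak metric propagates, at the reduced rate θλ, to the strong norms in which the two bounds interpolate.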
Abstract:
My work seeks to observe whether there is variation at certain evolutionary stages, starting from the knowledge obtained from the study of motor learning and development, with reference to a control group. My research details the types of aquatic abilities that emerge as a consequence of a specific program. My investigation uses a quantitative methodology that compares the moments of appearance of different evolutionary stages in a first phase, and a qualitative methodology for a second phase in which those learnings take place as a consequence of my intervention. To elaborate this process I have used an inductive and deductive method; I have based my research on fifteen years of experience of these practices and on the data provided to me by other authors' investigations.
Abstract:
We consider the Kudla-Millson lift from elliptic modular forms of weight (p+q)/2 to closed q-forms on locally symmetric spaces corresponding to the orthogonal group O(p,q). We study the L²-norm of the lift following the Rallis inner product formula. We compute the contribution at the Archimedean place. For locally symmetric spaces associated to even unimodular lattices, we obtain an explicit formula for the L²-norm of the lift, which often implies that the lift is injective. For O(p,2) we discuss how such injectivity results imply the surjectivity of the Borcherds lift.
Abstract:
In this paper we obtain necessary and sufficient conditions for double trigonometric series to belong to generalized Lorentz spaces, which in general are not symmetric. Estimates for the norms are given in terms of the coefficients.
Abstract:
In this paper we introduce new functional spaces, which we call net spaces. Using their properties, we obtain necessary and sufficient conditions for integral operators to be of strong or weak type. Estimates of the norm of the convolution operator in weighted Lebesgue spaces are presented.
Abstract:
Based on homology with GLUT1-5, we have isolated a cDNA for a novel glucose transporter, GLUTX1. This cDNA encodes a protein of 478 amino acids that shows between 29 and 32% identity with rat GLUT1-5 and 32-36% identity with plant and bacterial hexose transporters. Unlike GLUT1-5, GLUTX1 has a short extracellular loop between transmembrane domain (TM) 1 and TM2 and a long extracellular loop between TM9 and TM10 that contains the only N-glycosylation site. When expressed in Xenopus oocytes, GLUTX1 showed strong transport activity only after suppression of a dileucine internalization motif present in the amino-terminal region. Transport activity was inhibited by cytochalasin B and partly competed by D-fructose and D-galactose. The Michaelis-Menten constant for glucose was approximately 2 mM. When translated in reticulocyte lysates, GLUTX1 migrates as a 35-kDa protein that becomes glycosylated in the presence of microsomal membranes. Western blot analysis of GLUTX1 transiently expressed in HEK293T cells revealed a diffuse band with a molecular mass of 37-50 kDa that could be converted to an approximately 35-kDa polypeptide following enzymatic deglycosylation. Immunofluorescence microscopy detection of GLUTX1 transfected into HEK293T cells showed intracellular staining. Mutation of the dileucine internalization motif induced expression of GLUTX1 at the cell surface. GLUTX1 mRNA was detected in testis, hypothalamus, cerebellum, brainstem, hippocampus, and adrenal gland. We hypothesize that, in a similar fashion to GLUT4, in vivo cell surface expression of GLUTX1 may be inducible by a hormonal or other stimulus.
Abstract:
The financial crisis, on the one hand, and the recourse to ‘unconventional’ monetary policy, on the other, have given a sharp jolt to perceptions of the role and status of central banks. In this paper we start with a brief ‘contrarian’ history of central banks since the Second World War, which presents the Great Moderation and the restricted focus on inflation targeting as a temporary aberration from the norm. We then discuss how recent developments in fiscal and monetary policy have affected the role and status of central banks, notably their relationships with governments, before considering the environment central banks will face in the near and medium-term future and how they will have to change to address it.
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to probe the structure of the white matter non-invasively. Despite its potential, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a large variety of methods have recently been proposed to shorten the acquisition times. Among them, spherical deconvolution approaches have gained a lot of interest for their ability to reliably recover the intra-voxel fiber configuration with a relatively small number of data samples. To overcome the intrinsic instabilities of deconvolution, these methods use regularization schemes generally based on the assumption that the fiber orientation distribution (FOD) to be recovered in each voxel is sparse. The well-known Constrained Spherical Deconvolution (CSD) approach resorts to Tikhonov regularization, based on an ℓ2-norm prior, which promotes a weak version of sparsity. Also, in the last few years compressed sensing has been advocated to further accelerate the acquisitions, and ℓ1-norm minimization is generally employed as a means to promote sparsity in the recovered FODs. In this paper, we provide evidence that the use of an ℓ1-norm prior to regularize this class of problems is somewhat inconsistent with the fact that the fiber compartments all sum up to unity. To overcome this ℓ1 inconsistency while simultaneously exploiting sparsity more optimally than through an ℓ2 prior, we reformulate the reconstruction problem as a constrained formulation between a data term and a sparsity prior consisting of an explicit bound on the ℓ0 norm of the FOD, i.e. on the number of fibers. The method has been tested both on synthetic and real data. Experimental results show that the proposed ℓ0 formulation significantly reduces modeling errors compared to the state-of-the-art ℓ2 and ℓ1 regularization approaches.
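The ℓ1 inconsistency described above can be illustrated numerically (a toy sketch with invented volume fractions, not the paper's data or method): for any non-negative FOD whose compartments sum to unity, the ℓ1 norm is identically 1, so an ℓ1 prior cannot distinguish sparse from spread-out configurations, whereas an ℓ0 count of non-zero compartments can.

```python
import numpy as np

# Two hypothetical FODs on a discrete set of orientations, both
# non-negative with compartments summing to one (volume fractions).
fod_sparse = np.array([0.75, 0.25, 0.00, 0.00])  # two fiber compartments
fod_spread = np.array([0.25, 0.25, 0.25, 0.25])  # maximally spread out

# The l1 norm is the same for both: it equals 1 for every such FOD,
# so minimizing it carries no sparsity information.
print(np.linalg.norm(fod_sparse, 1))  # 1.0
print(np.linalg.norm(fod_spread, 1))  # 1.0

# An l0-style count of non-zero compartments does distinguish them,
# which is what an explicit bound on the number of fibers exploits.
print(np.count_nonzero(fod_sparse))   # 2
print(np.count_nonzero(fod_spread))   # 4
```

This is why bounding the number of non-zeros directly, rather than penalizing the (constant) ℓ1 norm, is the more consistent sparsity prior in this setting.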
Abstract:
We obtain upper and lower estimates of the (p, q) norm of the convolution operator. The upper estimate sharpens the Young-type inequalities due to O'Neil and Stepanov.
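The classical inequality being sharpened can be stated in its standard textbook form (this is the baseline Young inequality, not the paper's refined estimate): for 1 ≤ p, q, r ≤ ∞ with 1/r = 1/p + 1/q − 1,

```latex
\| f * g \|_{L^r} \;\le\; \| f \|_{L^p} \, \| g \|_{L^q}.
```

O'Neil-type refinements replace the Lebesgue norms by Lorentz norms on the same scaling line; the abstract's upper estimate sharpens bounds of this kind.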
Abstract:
Ectodermal organogenesis is regulated by inductive and reciprocal signalling cascades that involve multiple signal molecules in several conserved families. Ectodysplasin-A (Eda), a tumour necrosis factor-like signalling molecule, and its receptor Edar are required for the development of a number of ectodermal organs in vertebrates. In mice, lack of Eda leads to failure in primary hair placode formation and missing or abnormally shaped teeth, whereas mice overexpressing Eda are characterized by enlarged hair placodes and supernumerary teeth and mammary glands. Here, we report two signalling outcomes of the Eda pathway: suppression of bone morphogenetic protein (Bmp) activity and upregulation of sonic hedgehog (Shh) signalling. Recombinant Eda counteracted Bmp4 activity in developing teeth and, importantly, inhibition of BMP activity by exogenous noggin partially restored primary hair placode formation in Eda-deficient skin in vitro, indicating that suppression of Bmp activity was compromised in the absence of Eda. The downstream effects of the Eda pathway are likely to be mediated by transcription factor nuclear factor-kappaB (NF-kappaB), but the transcriptional targets of Edar have remained unknown. Using a quantitative approach, we show in cultured embryonic skin that Eda induced the expression of two Bmp inhibitors, Ccn2/Ctgf (CCN family protein 2/connective tissue growth factor) and follistatin. Moreover, our data indicate that Shh is a likely transcriptional target of Edar, but, unlike noggin, recombinant Shh was unable to rescue primary hair placode formation in Eda-deficient skin explants.
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the ℓ2 or the ℓ1TV norm, using prior knowledge of the edges of the object; one over-determined multiple-orientation method (COSMOS); and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high-resolution (0.65 mm isotropic) brain data acquired at 7T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically changed in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation. The l-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
Abstract:
When dealing with sustainability we are concerned with the biophysical as well as the monetary aspects of economic and ecological interactions. This multidimensional approach requires that special attention be given to dimensional issues in relation to curve-fitting practice in economics. Unfortunately, many empirical and theoretical studies in economics, as well as in ecological economics, apply dimensional numbers in exponential or logarithmic functions. We show that it is an analytical error to put a dimensional unit x into exponential functions (a^x) and logarithmic functions (log_a x). Secondly, we investigate the conditions of data sets under which a particular logarithmic specification is superior to the usual regression specification. This analysis shows that logarithmic specification superiority in terms of the least-squares norm is heavily dependent on the available data set. The last section deals with economists' “curve-fitting fetishism”. We propose that a distinction be made between curve fitting over past observations and the development of a theoretical or empirical law capable of maintaining its fitting power for any future observations. Finally, we conclude the paper with several epistemological issues in relation to dimensions and curve-fitting practice in economics.
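The dimensional error with log_a x can be demonstrated numerically (a toy sketch with invented lengths, not the paper's data): changing the unit of a dimensional x shifts every ln x by the same constant, so any intercept fitted on ln x is unit-dependent, while a^x changes functional form entirely under a unit rescaling.

```python
import numpy as np

# Hypothetical lengths, expressed first in metres and then in kilometres.
x_m = np.array([100.0, 1000.0, 10000.0])
x_km = x_m / 1000.0

# ln of a dimensional quantity is unit-dependent: the two series differ
# by the constant ln(1000), so the intercept of any regression on ln(x)
# shifts with the chosen unit of measurement.
shift = np.log(x_m) - np.log(x_km)
print(shift)  # every entry equals ln(1000) ≈ 6.9078

# An exponential a**x is worse off still: rescaling the unit changes the
# shape of the function itself, not just an additive constant.
a = 2.0
print(a ** x_m[0], a ** x_km[0])  # 2**100 vs 2**0.1: entirely different
```

Only dimensionless arguments (e.g. ratios x/x0 against a reference quantity) make these functions invariant to the choice of unit.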
Abstract:
Macrophages and muscle cells are the main targets for invasion of Trypanosoma cruzi. Ultrastructural studies of this phenomenon in vitro showed that invasion occurs by endocytosis, with attachment and internalization being mediated by different components capable of recognizing epi- or trypomastigotes (TRY). A parasitophorous vacuole was formed in both cell types, thereafter fusing with lysosomes. Thus, the mechanism of T. cruzi invasion of host cells (HC) is essentially similar (during a primary infection, in the absence of a specific immune response), regardless of whether the target cell is a professional or a non-professional phagocytic cell. Using sugars, lectins, glycosidases, proteinases and proteinase inhibitors, we observed that the relative balance between exposed sialic acid and galactose/N-acetylgalactosamine (GAL) residues on the TRY surface determines the parasite's capacity to invade HC, and that lectin-mediated phagocytosis with GAL specificity is important for internalization of T. cruzi into macrophages. On the other hand, GAL residues on the surface of heart muscle cells participate in TRY adhesion. TRY need to process proteolytically both the HC surface and their own, to expose the necessary ligands and receptors that allow binding to, and internalization in, the host cell. The diverse range of molecular mechanisms which the parasite could use to invade the host cell may correspond to differences in the available "receptors" on the surface of each specific cell type. Acute-phase components with lectin or proteinase-inhibitory activities (α-macroglobulins) may also be involved in the T. cruzi-host cell interaction.
Abstract:
This dissertation focuses on the practice of regulatory governance, through the study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetical-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible to make scientific inferences and general conclusions to a certain extent, according to a Bayesian conception of knowledge, in order to update the prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). Following an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration hardly led to the crumbling of the state, but instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996).
Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated from political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up for regulating very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve a special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy. On the other hand, some hard questions remain unaddressed about their role as political actors (since, together with regulatory competencies, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions), as well as about their performance.