779 results for Continuous Spaces
Abstract:
The small intestine is a dynamic and complex organ that is characterized by constant epithelium turnover and crosstalk among various cell types and the microbiota. Lymphatic capillaries of the small intestine, called lacteals, play key roles in dietary fat absorption and the gut immune response; however, little is known about the molecular regulation of lacteal function. Here, we performed a high-resolution analysis of the small intestinal stroma and determined that lacteals reside in a permanent regenerative, proliferative state that is distinct from embryonic lymphangiogenesis or quiescent lymphatic vessels observed in other tissues. We further demonstrated that this continuous regeneration process is mediated by Notch signaling and that the expression of the Notch ligand delta-like 4 (DLL4) in lacteals requires activation of VEGFR3 and VEGFR2. Moreover, genetic inactivation of Dll4 in lymphatic endothelial cells led to lacteal regression and impaired dietary fat uptake. We propose that such a slow lymphatic regeneration mode is necessary to match a unique need of intestinal lymphatic vessels for both continuous maintenance, due to the constant exposure to dietary fat and mechanical strain, and efficient uptake of fat and immune cells. Our work reveals how lymphatic vessel responses are shaped by tissue specialization and uncovers a role for continuous DLL4 signaling in the function of adult lymphatic vasculature.
Abstract:
This paper examines the role that existing Latin American policy institutions and regulatory coordination mechanisms (otherwise referred to as “regional regulatory spaces”) play in the innovation and development of the ICT sector. In doing so, it recognizes that sector regulation does not currently match regional development, thereby limiting its potential progress. To shed light on the role that the “regional regulatory space” could potentially play, the author addresses three main questions:
· Is there a hierarchy of policies by which it is presumed easy to “coordinate within well-defined technical subjects” but extremely challenging to “agree in matters of public policy”?
· What would happen if policy divergence became more important than convergence? Is it reasonable to consider creating “regional regulatory spaces”, or should we focus solely on technological coordination?
· Do institutions capable of serving as effective regional regulatory spaces already exist in Latin America, or should we consider modifying existing institutions or creating new ones?
After analyzing these three overarching areas of concern, the paper discusses the need to create a regional space in order to harmonize ICT regulatory frameworks and public policies. Ultimately, this work aims to advance the institutional formulation beyond pre-existing efforts.
Abstract:
The thesis discusses games and the gaming experience. It is divided into two main sections: the first examines games in general, while the second concentrates exclusively on electronic games. The text approaches games from two distinct directions, looking at both their spatiality and their narrativity at the same time. These two points of view are combined from the beginning of the text, as they are used in conceptualising the nature of the gaming experience. The purpose of the thesis is to investigate two closely related issues concerning both the field of game studies and the nature of games. In regard to studying games, the focus is placed on the juxtaposition of ludology and narratology, which acts as a framework for looking at gaming. In addition to asking whether it is possible to undermine this state of affairs through the spatiality of games, the text looks at the interrelationships of games and their spaces, as well as the role of narratives in those spaces. The thesis is characterised by its discussion of alternative points of view and by its hypothetical nature. Over the course of the text, it becomes apparent that the relationship between games and narratives is strongly twofold: on the one hand, the player continuously narrativizes the states the game is in while playing; on the other, the narratives residing within the game space form their own partially separate narrative spaces. These spaces affect the conception the player has of the game states and of the events taking place in the game space itself.
Abstract:
Chromogenic immunohistochemistry (IHC) is omnipresent in cancer diagnosis, but it has also been criticized for its technical limits in quantifying the level of protein expression on tissue sections, thus potentially masking clinically relevant data. Shifting from qualitative to quantitative, immunofluorescence (IF) has recently gained attention, yet the question of how precisely IF can quantify antigen expression remains unanswered, particularly regarding its technical limitations and applicability to multiple markers. Here we introduce microfluidic precision IF, which accurately quantifies the target expression level on a continuous scale based on microfluidic IF staining of standard tissue sections and low-complexity automated image analysis. We show that the level of HER2 protein expression, as continuously quantified using microfluidic precision IF in 25 breast cancer cases, including several cases with an equivocal IHC result, can predict the number of HER2 gene copies as assessed by fluorescence in situ hybridization (FISH). Finally, we demonstrate that the working principle of this technology is not restricted to HER2 but can be extended to other biomarkers. We anticipate that our method has the potential to provide automated, fast and high-quality quantitative in situ biomarker data using low-cost immunofluorescence assays, as increasingly required in the era of individually tailored cancer therapy.
Abstract:
The present article contributes to the ongoing academic debate on migrants' appropriation of artistic and political spaces in Germany. Cologne, one of the largest cities in Germany, is an interesting example of the tension between political discourse centred around multiculturalism and cultural segregation processes. The 'no fool is illegal' carnival organised by asylum seekers shows their capacity to act, as they reinvent an old local tradition by reinterpreting medieval rituals. Today, different groups and associations appropriate this festive art space: migrants, gays and lesbians, feminists and far-left groups either organise their own parties or take part in the official parties and parades as separate groups. As a result, the celebration of diversity figures on the local political agenda and becomes part of the official carnival festivities. This leads to a blurring of boundaries, whereby mainstream popular culture becomes more and more influenced by multicultural elements.
Abstract:
PURPOSE: Obstructive sleep apnea syndrome (OSA) increases the risk of cardiovascular disease. We aimed to evaluate the effect of continuous positive airway pressure (CPAP) treatment on coronary endothelium-dependent vasoreactivity in OSA patients by quantifying the myocardial blood flow (MBF) response to cold pressor testing (CPT). METHODS: In the morning after polysomnography (PSG), all participants underwent a dynamic (82)Rb cardiac positron emission tomography/computed tomography (PET/CT) scan at rest, during CPT and during adenosine stress. PSG and PET/CT were repeated at least 6 weeks after initiating CPAP treatment. OSA patients were compared to controls and according to response to CPAP. Patients' characteristics and PSG parameters were used to determine predictors of CPT-MBF. RESULTS: Thirty-two untreated OSA patients (age 58 ± 13 years, 27 men) and 9 controls (age 62 ± 5 years, 4 men) were enrolled. At baseline, compared to controls (apnea-hypopnea index (AHI) = 5.3 ± 2.6/h), untreated OSA patients (AHI = 48.6 ± 19.7/h) tended to have a lower CPT-MBF (1.1 ± 0.2 mL/min/g vs. 1.3 ± 0.4 mL/min/g, p = 0.09). After initiating CPAP, CPT-MBF was not different between well-treated patients (AHI <10/h) and controls (1.3 ± 0.3 mL/min/g vs. 1.3 ± 0.4 mL/min/g, p = 0.83), but it was lower for insufficiently treated patients (AHI ≥10/h) (0.9 ± 0.2 mL/min/g vs. 1.3 ± 0.4 mL/min/g, p = 0.0045). CPT-MBF was also higher in well-treated than in insufficiently treated patients (1.3 ± 0.3 mL/min/g vs. 0.9 ± 0.2 mL/min/g, p = 0.001). Mean nocturnal oxygen saturation (β = -0.55, p = 0.02) and BMI (β = -0.58, p = 0.02) were independent predictors of CPT-MBF in OSA patients. CONCLUSIONS: Coronary endothelial vasoreactivity is impaired in insufficiently treated OSA patients compared to well-treated patients and controls, confirming the need for CPAP optimization.
Abstract:
Living bacteria or yeast cells are frequently used as bioreporters for the detection of specific chemical analytes or conditions of sample toxicity. In particular, bacteria or yeast equipped with synthetic gene circuitry that allows the production of a reliable non-cognate signal (e.g., fluorescent protein or bioluminescence) in response to a defined target make robust and flexible analytical platforms. We report here how bacterial cells expressing a fluorescence reporter ("bactosensors"), which are mostly used for batch sample analysis, can be deployed for automated semi-continuous target analysis in a single concise biochip. Escherichia coli-based bactosensor cells were continuously grown in a 13 or 50 nanoliter-volume reactor on a two-layered polydimethylsiloxane-on-glass microfluidic chip. Physiologically active cells were directed from the nl-reactor to a dedicated sample exposure area, where they were concentrated and reacted within 40 minutes to the target chemical by localized emission of the fluorescent reporter signal. We demonstrate the functioning of the bactosensor chip by the automated detection of 50 μg arsenite-As l(-1) in water on consecutive days and after one week of constant operation. The best induction of the bactosensors, 6-9-fold at 50 μg l(-1), was found at an apparent dilution rate of 0.12 h(-1) in the 50 nl microreactor. The bactosensor-chip principle could be widely applicable for constructing automated monitoring devices for a variety of targets in different environments.
Abstract:
We extend Deligne's weight filtration to the integer cohomology of complex analytic spaces (endowed with an equivalence class of compactifications). In general, the weight filtration that we obtain is not part of a mixed Hodge structure. Our purely geometric proof is based on cubical descent for resolution of singularities and Poincaré-Verdier duality. Using similar techniques, we introduce the singularity filtration on the cohomology of compactifiable analytic spaces. This is a new and natural analytic invariant which does not depend on the equivalence class of compactifications and is related to the weight filtration.
Abstract:
Phenomena with a constrained sample space appear frequently in practice. This is the case, e.g., with strictly positive data, or with compositional data, like percentages or proportions. If the natural measure of difference is not the absolute one, simple algebraic properties show that it is more convenient to work with a geometry different from the usual Euclidean geometry in real space, and with a measure different from the usual Lebesgue measure, leading to alternative models which better fit the phenomenon under study. The general approach is presented and illustrated using the normal distribution, both on the positive real line and on the D-part simplex. The original ideas of McAlister in his 1879 introduction of the lognormal distribution are recovered and updated.
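The abstract's central idea, that strictly positive data call for a relative rather than an absolute measure of difference, can be sketched in a few lines. This is an illustrative example only: the data are invented, and the plain log transform stands in for the paper's full geometric treatment of the positive real line.

```python
import math
import statistics

# Hypothetical strictly positive observations (e.g. concentrations).
data = [0.5, 1.2, 2.0, 4.1, 8.3]

# On the absolute scale, 1 -> 2 is the same difference as 101 -> 102.
# On the relative scale natural for positive data, 1 -> 2 is instead
# the same "step" as 50 -> 100 (a doubling).
def relative_distance(x, y):
    return abs(math.log(x) - math.log(y))

# Fitting a lognormal model is just fitting a normal model in log space,
# which is essentially McAlister's 1879 idea.
logs = [math.log(x) for x in data]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

# The geometric mean is the natural centre under this geometry.
geometric_mean = math.exp(mu)
```

Under this alternative geometry the usual normal-distribution machinery carries over unchanged once the data are mapped to log space; the simplex case in the abstract uses log-ratios in the same spirit.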
Abstract:
A continuous random variable is expanded as a sum of a sequence of uncorrelated random variables. These variables are principal dimensions in continuous scaling on a distance function, as an extension of classic scaling on a distance matrix. For a particular distance, these dimensions are principal components. Some properties are then studied and an inequality is obtained. Diagonal expansions are considered from the same continuous scaling point of view, by means of the chi-square distance. The geometric dimension of a bivariate distribution is defined and illustrated with copulas. It is shown that the dimension can have the power of the continuum.
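Classic scaling on a distance matrix, which the abstract extends to the continuous case, can be illustrated on a toy discrete example. The configuration below is hypothetical and one-dimensional; the sketch shows only the standard double-centering and leading-eigenvector steps of classic (metric) scaling, not the paper's continuous extension.

```python
import math

# A hypothetical 1-D configuration; the algorithm sees only its distances.
points = [0.0, 1.0, 3.0]
n = len(points)
D2 = [[(a - b) ** 2 for b in points] for a in points]  # squared distances

# Double centering: B = -1/2 * J D2 J, with J = I - (1/n) 11^T.
row = [sum(r) / n for r in D2]
col = [sum(D2[i][j] for i in range(n)) / n for j in range(n)]
tot = sum(map(sum, D2)) / n ** 2
B = [[-0.5 * (D2[i][j] - row[i] - col[j] + tot) for j in range(n)]
     for i in range(n)]

# Power iteration extracts the leading principal dimension of B.
v = [1.0, 0.5, -1.0]
for _ in range(200):
    w = [sum(B[i][j] * v[j] for j in range(n)) for i in range(n)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]
eigval = sum(v[i] * sum(B[i][j] * v[j] for j in range(n)) for i in range(n))

# Coordinates on the first principal axis (up to sign): sqrt(lambda) * v.
coords = [math.sqrt(eigval) * x for x in v]
```

The recovered coordinates reproduce the original pairwise distances exactly, since the configuration is one-dimensional; in the continuous setting of the abstract, the eigenvectors of this double-centered operator become the uncorrelated principal dimensions of the expansion.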
Abstract:
Today's organizations must be able to react to rapid changes in the market. These rapid changes create pressure to continuously find new, efficient ways to organize work practices. Increased competition requires businesses to become more effective, to pay attention to the quality of management, and to make people understand their work's impact on the final result. The fundamentals of continuous improvement are the systematic and agile tackling of identified individual process constraints, and the fact that nothing ultimately improves without change. Successful continuous improvement requires management commitment, education, implementation, measurement, recognition and regeneration. These ingredients form the foundation both for breakthrough projects and for small-step ongoing improvement activities. One part of an organization's management system is the set of quality tools, which provide systematic methodologies for identifying problems, defining their root causes, finding solutions, gathering and sorting data, supporting decision making, implementing changes, and many other management tasks. Organizational change management includes processes and tools for managing people through an organizational-level change. These tools include a structured approach that can be used for the effective transition of organizations through change. When combined with an understanding of how individuals experience change, these tools provide a framework for managing people in change.
Abstract:
We generalize to arbitrary waiting-time distributions some results which were previously derived for discrete distributions. We show that, for any two waiting-time distributions with the same mean delay time, the one with higher dispersion will lead to a faster front. Experimental data on the speed of virus infections in a plaque are correctly explained by the theoretical predictions using a Gaussian delay-time distribution, which is more realistic for this system than the Dirac delta distribution considered previously [J. Fort and V. Méndez, Phys. Rev. Lett. 89, 178101 (2002)].
Abstract:
Software integration is the stage in a software development process in which separate components are assembled to produce a single product. It is important to manage the risks involved and to be able to integrate smoothly, because software cannot be released without first being integrated. Furthermore, it has been shown that the integration and testing phase can make up 40% of the overall project costs. These issues can be mitigated by using a software engineering practice called continuous integration. This thesis presents how continuous integration was introduced to the author's employer organisation. This includes studying how the continuous integration process works and creating the technical basis for using the process on future projects. The implemented system supports software written in the C and C++ programming languages on the Linux platform, but the general concepts can be applied to any programming language and platform by selecting the appropriate tools. The results demonstrate in detail which issues need to be solved when the process is adopted in a corporate environment. Additionally, they provide an implementation and a process description suited to the organisation. The results show that continuous integration can reduce the risks involved in a software process and increase the quality of the product as well.
Abstract:
An evaluation of the performance of a continuous-flow hydride generator-nebulizer for flame atomic absorption spectrometry was carried out. The effects of nebulizer gas flow rate, sample acid concentration, sample and tetrahydroborate uptake rates, and reductant concentration on the As and Se absorbance signals were optimized. A hydrogen-argon flame was used. An improvement in analytical sensitivity relative to the conventional bead nebulizer used in flame AA was obtained (2 (As) and 4.8 (Se) µg L-1). Detection limits (3σb) of 1 (As) and 1.3 (Se) µg L-1 were obtained. The accuracy of the method was checked by analyzing an oyster tissue reference material.
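The quoted detection limits follow the conventional 3σb criterion: three times the standard deviation of replicate blank measurements divided by the calibration slope. A minimal sketch of that calculation, with entirely hypothetical blank readings and slope, since the abstract reports only the final figures:

```python
import statistics

# Hypothetical replicate blank absorbance readings (illustration only).
blank_signals = [0.0021, 0.0018, 0.0024, 0.0019, 0.0022,
                 0.0020, 0.0023, 0.0017, 0.0021, 0.0020]

# Hypothetical calibration slope, in absorbance per (µg L^-1).
slope = 0.00065

sigma_b = statistics.stdev(blank_signals)   # standard deviation of the blanks
detection_limit = 3 * sigma_b / slope       # 3-sigma-b criterion, in µg L^-1
```

With these invented numbers the result lands near 1 µg L-1, on the order of the As value reported in the abstract; the real figures depend on the measured blank scatter and calibration of the instrument.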