80 results for "Free viewing"
Abstract:
The availability of induced pluripotent stem cells (iPSCs) has created extraordinary opportunities for modeling and perhaps treating human disease. However, all reprogramming protocols used to date involve the use of products of animal origin. Here, we set out to develop a protocol to generate and maintain human iPSCs that would be entirely devoid of xenobiotics. We first developed a xeno-free cell culture medium that supported the long-term propagation of human embryonic stem cells (hESCs) to a similar extent as conventional media containing products of animal origin or a commercially available xeno-free medium. We also derived primary cultures of human dermal fibroblasts under strict xeno-free conditions (XF-HFF), and we show that they can be used both as the cell source for iPSC generation and as autologous feeder cells to support their growth. We also replaced other reagents of animal origin (trypsin, gelatin, Matrigel) with their recombinant equivalents. Finally, we used vesicular stomatitis virus G-pseudotyped retroviral particles expressing a polycistronic construct encoding Oct4, Sox2, Klf4, and GFP to reprogram XF-HFF cells under xeno-free conditions. A total of 10 xeno-free human iPSC lines were generated, which could be continuously passaged in xeno-free conditions and maintained characteristics indistinguishable from hESCs, including colony morphology and growth behavior, expression of pluripotency-associated markers, and pluripotent differentiation ability in vitro and in teratoma assays. Overall, the results presented here demonstrate that human iPSCs can be generated and maintained under strict xeno-free conditions and provide a path to good manufacturing practice (GMP) applicability that should facilitate the clinical translation of iPSC-based therapies.
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
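As a rough illustration of the forward Eulerian integration step described above, the following Python sketch integrates a hypothetical non-stationary velocity field to recover point displacements. The analytic velocity function, step count, and units are assumptions standing in for the spatiotemporal B-spline parameterization used by TDFFD; this is not the authors' implementation.

    import numpy as np

    # Sketch: forward Euler integration of a non-stationary velocity field v(x, t)
    # to recover the displacement of tracked material points.

    def velocity(x, t):
        """Hypothetical smooth, time-varying 3D velocity field (stand-in for the
        B-spline field; units are arbitrary)."""
        return np.stack([0.5 * np.sin(2 * np.pi * t) * np.ones(len(x)),
                         0.1 * x[:, 2],
                         -0.1 * x[:, 1]], axis=1)

    def integrate_displacement(points, t0, t1, n_steps=100):
        """Return phi(x, t1) - x for points tracked from t0 to t1 by forward Euler."""
        x = points.astype(float).copy()
        dt = (t1 - t0) / n_steps
        t = t0
        for _ in range(n_steps):
            x += dt * velocity(x, t)   # Eulerian update of the point positions
            t += dt
        return x - points              # displacement sampled at the input points

    if __name__ == "__main__":
        pts = np.random.rand(5, 3) * 10.0           # a few sample points
        print(integrate_displacement(pts, 0.0, 1.0))

In the actual algorithm the velocity field is optimized against the image-similarity and incompressibility terms; the sketch only shows how a displacement field follows from a velocity field by forward integration.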
Abstract:
This paper describes a Computer-Supported Collaborative Learning (CSCL) case study in engineering education carried out within the context of a network management course. The case study shows that the use of two computing tools developed by the authors and based on Free and Open-Source Software (FOSS) provides significant educational benefits over traditional engineering pedagogical approaches, in terms of the acquisition of both concepts and engineering competencies. First, the Collage authoring tool guides and supports the course teacher in authoring computer-interpretable representations (using the IMS Learning Design standard notation) of effective collaborative pedagogical designs. Second, the Gridcole system supports the enactment of those designs by guiding the students through the prescribed sequence of learning activities. The paper introduces the goals and context of the case study, elaborates on how Collage and Gridcole were employed, describes the evaluation methodology applied, and discusses the most significant findings derived from the case study.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of a classical PFSP heuristic that generates different alternative initial solutions of similar quality. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating our biased randomization process together with a high-quality pseudo-random number generator.
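For readers unfamiliar with the ILS template that ILS-ESP builds on, the Python sketch below shows a generic Iterated Local Search for the PFSP with a makespan objective. The insertion-based local search, single-job relocation perturbation, and acceptance rule are illustrative assumptions, not the ILS-ESP operators described in the article.

    import random

    def makespan(perm, proc):
        """Completion time of the last job on the last machine.
        proc[j][m] = processing time of job j on machine m."""
        machines = len(proc[0])
        completion = [0] * machines
        for j in perm:
            completion[0] += proc[j][0]
            for m in range(1, machines):
                completion[m] = max(completion[m], completion[m - 1]) + proc[j][m]
        return completion[-1]

    def local_search(perm, proc):
        """First-improvement insertion neighborhood."""
        best, best_val = list(perm), makespan(perm, proc)
        improved = True
        while improved:
            improved = False
            for i in range(len(best)):
                for k in range(len(best)):
                    if i == k:
                        continue
                    cand = list(best)
                    cand.insert(k, cand.pop(i))
                    val = makespan(cand, proc)
                    if val < best_val:
                        best, best_val, improved = cand, val, True
                        break
                if improved:
                    break
        return best, best_val

    def ils(proc, iterations=200, seed=0):
        rng = random.Random(seed)
        n = len(proc)
        current, current_val = local_search(list(range(n)), proc)
        best, best_val = list(current), current_val
        for _ in range(iterations):
            cand = list(current)
            i, k = rng.sample(range(n), 2)      # perturbation: relocate one job
            cand.insert(k, cand.pop(i))
            cand, val = local_search(cand, proc)
            if val <= current_val:              # simple acceptance rule (assumption)
                current, current_val = cand, val
            if val < best_val:
                best, best_val = list(cand), val
        return best, best_val

    if __name__ == "__main__":
        proc = [[random.randint(1, 20) for _ in range(4)] for _ in range(8)]  # 8 jobs, 4 machines
        print(ils(proc))

The biased-randomization idea mentioned in the abstract would replace the identity starting permutation with randomized variants of a constructive PFSP heuristic, which is what makes the approach attractive for parallel runs.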
Abstract:
Using a new, long-run dataset on capital account regulations in 16 countries over the period 1890-2001, we investigate why equity return correlations changed over the last century and show that correlations increase as financial markets are liberalized. These findings are robust to controlling for both the Forbes-Rigobon bias and global averages in equity return correlations. We test the robustness of our conclusions and show that greater synchronization of fundamentals is not the main cause of increasing correlations. These results imply that the home bias puzzle may be smaller than traditionally claimed.
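For reference, the Forbes-Rigobon adjustment mentioned above corrects conditional correlations for heteroskedasticity. In its standard form (generic notation, not necessarily the paper's),

    \rho^{*} \;=\; \frac{\rho}{\sqrt{1 + \delta\,(1 - \rho^{2})}},
    \qquad
    \delta \;=\; \frac{\sigma^{2}_{\text{high}}}{\sigma^{2}_{\text{low}}} - 1,

where \rho is the correlation measured in the high-volatility sample and \delta is the relative increase in the variance of the conditioning market's returns.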
Abstract:
This paper examines the relationship between the equity premium and the risk-free rate at three different maturities using post-1973 data for a panel of 7 OECD countries. We show the existence of subsample instabilities, of some cross-country differences, and of inconsistencies with the expectations theory of the term structure. We perform simulations using a standard consumption-based CAPM model and demonstrate that the basic features of Mehra and Prescott's (1985) puzzle remain, regardless of the time period, the investment maturity, and the country considered. Modifications of the basic setup are also considered.
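As background, consumption-based CAPM simulations of this kind are typically built on the CRRA Euler equation, shown here in generic form (the notation is not necessarily the paper's):

    1 \;=\; \mathbb{E}_{t}\!\left[\beta \left(\frac{C_{t+1}}{C_{t}}\right)^{-\gamma} R_{i,t+1}\right],

so the model-implied equity premium is governed by the covariance of consumption growth with returns, which the Mehra-Prescott (1985) puzzle shows to be too small to match observed premia at plausible values of \gamma.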
Abstract:
Traditional economic wisdom says that free entry in a market will drive profits down to zero. This conclusion is usually drawn under the assumption of perfect information. We assume that a priori there exists imperfect information about the profitability of the market, but that potential entrants may learn the demand curve perfectly at negligible cost by engaging in market research. Even if in equilibrium firms learn the demand perfectly, profits may be strictly positive because of insufficient entry. The mere fact that it will not become common knowledge that every entrant has perfect information about demand causes this surprising result. "Belief means doubt. Knowing means certainty." (Introduction to the Kabbalah)
Abstract:
This letter discusses the detection and correction of residual motion errors that appear in airborne synthetic aperture radar (SAR) interferograms due to the lack of precision in the navigation system. As is shown, the effect of this lack of precision is twofold: azimuth registration errors and azimuth phase undulations. Up to now, the correction of the former was carried out by estimating the registration error and interpolating, while the latter was based on the estimation of the azimuth phase undulations to compensate the phase of the computed interferogram. In this letter, a new correction method is proposed which avoids the interpolation step and corrects the azimuth phase undulations at the same time. Additionally, the spectral diversity technique, used to estimate registration errors, is critically analyzed. Airborne L-band repeat-pass interferometric data from the German Aerospace Center (DLR) experimental airborne SAR is used to validate the method.
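As a pointer for readers, the spectral diversity technique mentioned above estimates azimuth misregistration from the phase difference between two interferograms formed from separate azimuth sub-bands. In its usual form (generic notation, not necessarily the letter's),

    \Delta\phi_{\mathrm{SD}} \;=\; 2\pi\,\Delta f_{\mathrm{DC}}\,\Delta t
    \quad\Longrightarrow\quad
    \Delta t \;=\; \frac{\Delta\phi_{\mathrm{SD}}}{2\pi\,\Delta f_{\mathrm{DC}}},

where \Delta f_{\mathrm{DC}} is the separation between the sub-band Doppler centroids and \Delta t is the azimuth misregistration expressed in time.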
Abstract:
We present a georeferenced photomosaic of the Lucky Strike hydrothermal vent field (Mid-Atlantic Ridge, 37°18’N). The photomosaic was generated from digital photographs acquired using the ARGO II seafloor imaging system during the 1996 LUSTRE cruise, which surveyed a ~1 km² zone and provided a coverage of ~20% of the seafloor. The photomosaic has a pixel resolution of 15 mm and encloses the areas with known active hydrothermal venting. The final mosaic is generated after an optimization that includes the automatic detection of the same benthic features across different images (feature matching), followed by a global alignment of images based on the vehicle navigation. We also provide software to construct mosaics from large sets of images for which georeferencing information exists (location, attitude, and altitude per image), to visualize them, and to extract data. Georeferencing information can be provided by the raw navigation data (collected during the survey) or result from the optimization obtained from image matching. Mosaics based solely on navigation can be readily generated by any user, but the optimization and global alignment of the mosaic require a case-by-case approach for which no universally applicable software is available. The Lucky Strike photomosaics (optimized and navigation-only) are publicly available through the Marine Geoscience Data System (MGDS, http://www.marine-geo.org). The mosaic-generating and viewing software is available through the Computer Vision and Robotics Group Web page at the University of Girona (http://eia.udg.es/_rafa/mosaicviewer.html).
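As an illustration of the feature-matching step used to refine image placement, the Python sketch below detects keypoints in two overlapping images, matches descriptors, and estimates a homography with RANSAC using OpenCV. It is a generic example, not the Girona mosaicking software; the synthetic test images and thresholds are assumptions.

    import numpy as np
    import cv2

    def make_test_image(seed=0):
        """Synthetic image with bright blobs so that ORB finds corners."""
        rng = np.random.default_rng(seed)
        img = np.zeros((400, 400), dtype=np.uint8)
        for _ in range(40):
            x, y = rng.integers(20, 380, size=2)
            cv2.rectangle(img, (int(x) - 8, int(y) - 8), (int(x) + 8, int(y) + 8),
                          int(rng.integers(80, 255)), -1)
        return img

    def estimate_homography(img1, img2, min_matches=4):
        """ORB keypoints + brute-force matching + RANSAC homography."""
        orb = cv2.ORB_create(nfeatures=1000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        if len(matches) < min_matches:
            return None
        src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H

    if __name__ == "__main__":
        img1 = make_test_image()
        shift = np.float32([[1, 0, 25], [0, 1, 10]])   # known 25 px / 10 px offset
        img2 = cv2.warpAffine(img1, shift, (400, 400))
        print(estimate_homography(img1, img2))          # translation part should be ~(25, 10)

In a full mosaicking pipeline, pairwise estimates of this kind feed a global alignment that also uses the vehicle navigation, which is the optimization referred to in the abstract.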
Abstract:
The aim of this brief is to present an original design methodology that permits implementing latch-up-free smart power circuits in a very simple, cost-effective technology. The basic concept used for this purpose is to leave floating the wells of the MOS transistors that are most susceptible to triggering latch-up.
Abstract:
This paper derives the HJB (Hamilton-Jacobi-Bellman) equation for sophisticated agents in a finite-horizon dynamic optimization problem with non-constant discounting in a continuous setting, using a dynamic programming approach. A simple example is used to illustrate the applicability of this HJB equation, by suggesting a method for constructing the subgame-perfect equilibrium solution to the problem. Conditions for observational equivalence with an associated problem with constant discounting are analyzed. Special attention is paid to the case of free terminal time. Strotz's model (an eating-cake problem for a nonrenewable resource with non-constant discounting) is revisited.
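For orientation, the constant-discounting benchmark against which observational equivalence is checked is the standard current-value HJB equation (generic notation, not the paper's):

    \rho V(x,t) - V_{t}(x,t) \;=\; \max_{u}\Big\{ F(x,u,t) + V_{x}(x,t)\, f(x,u,t) \Big\},
    \qquad V(x,T) = S(x),

where F is the instantaneous payoff, \dot{x} = f(x,u,t) the state dynamics, \rho the constant discount rate, and S the terminal payoff. The paper's contribution is the analogue of this equation for sophisticated agents when the constant rate \rho is replaced by a non-constant discount function.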
Abstract:
The density and excitation energy dependence of the symmetry energy and symmetry free energy for finite nuclei are calculated microscopically in a microcanonical framework, taking into account thermal and expansion effects. A finite-range, momentum- and density-dependent two-body effective interaction is employed for this purpose. The influence of mass, isospin, and the equation of state (EOS) on these quantities is also investigated; our calculated results are consistent with the available experimental data.
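As a reminder of the standard definitions behind these quantities (generic notation, not necessarily the paper's), the symmetry energy and symmetry free energy are the curvatures of the energy and free energy per nucleon with respect to the isospin asymmetry:

    E_{\mathrm{sym}}(\rho,T) \;=\; \frac{1}{2}\,\frac{\partial^{2}(E/A)}{\partial \delta^{2}}\bigg|_{\delta=0},
    \qquad
    F_{\mathrm{sym}}(\rho,T) \;=\; \frac{1}{2}\,\frac{\partial^{2}(F/A)}{\partial \delta^{2}}\bigg|_{\delta=0},
    \qquad
    \delta \;=\; \frac{\rho_{n} - \rho_{p}}{\rho_{n} + \rho_{p}}.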
Abstract:
The gauge-invariant actions for open and closed free bosonic string field theories are obtained from the string field equations in the conformal gauge using the cohomology operations of Banks and Peskin. For the closed-string theory no restrictions are imposed on the gauge parameters.
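Schematically, and only as generic background rather than the specific Banks-Peskin construction used in the paper, gauge-invariant free string field actions take the form

    S \;\propto\; \tfrac{1}{2}\,\langle \Psi,\, Q\,\Psi \rangle,
    \qquad \delta\Psi = Q\,\Lambda,
    \qquad Q^{2} = 0,

where Q is a nilpotent kinetic operator whose cohomology encodes the physical states; gauge invariance follows from Q^{2} = 0 together with the (anti)self-adjointness of Q with respect to the inner product.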
Abstract:
A simple expression for the Gibbs free energy of formation of a pure-component or eutectic-alloy glass, relative to the stable crystalline phase (or phases) at the same temperature, is deduced by means of thermodynamic arguments. The expression obtained is expected to apply to both monocomponent and multicomponent liquid alloys that might become glasses from the supercooled liquid state, irrespective of the critical cooling rate needed to avoid crystallization.
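For context, the simplest widely used estimate of this quantity is Turnbull's approximation, given here only for comparison (it is not the expression derived in the paper):

    \Delta G_{l \to x}(T) \;\approx\; \frac{\Delta H_{f}\,\Delta T}{T_{m}},
    \qquad \Delta T = T_{m} - T,

where \Delta H_{f} is the enthalpy of fusion and T_{m} the melting temperature; more refined expressions differ mainly in how the specific-heat difference between the undercooled liquid and the crystal is treated.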