Abstract:
In this paper we prove a formula for the analytic index of a basic Dirac-type operator on a Riemannian foliation, solving a problem that has been open for many years. We also consider more general indices given by twisting the basic Dirac operator by a representation of the orthogonal group. The formula is a sum of integrals over blowups of the strata of the foliation and also involves eta invariants of associated elliptic operators. As a special case, a Gauss-Bonnet formula for the basic Euler characteristic is obtained using two independent proofs.
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
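As an illustrative sketch of this biased-randomization idea (the geometric distribution and the function names are assumptions for illustration, not necessarily the paper's exact design), the following Python fragment picks each construction step from a greedy-sorted candidate list using a skewed distribution, so repeated runs yield many distinct near-greedy solutions:

```python
import math
import random

def biased_pick(sorted_candidates, beta=0.3):
    """Pick an index from a greedy-sorted list using a geometric
    distribution: position 0 (the greedy choice) is most likely,
    but worse-ranked candidates keep a nonzero probability."""
    n = len(sorted_candidates)
    u = 1.0 - random.random()  # in (0, 1], avoids log(0)
    idx = int(math.log(u) / math.log(1.0 - beta)) % n  # truncate to list length
    return sorted_candidates[idx]

def biased_randomized_construction(elements, cost, beta=0.3):
    """Build one solution by repeatedly applying a biased pick to
    the remaining elements, sorted by the greedy criterion."""
    remaining = list(elements)
    solution = []
    while remaining:
        remaining.sort(key=cost)  # classical greedy ordering
        choice = biased_pick(remaining, beta)
        solution.append(choice)
        remaining.remove(choice)
    return solution

# Repeated calls produce alternative good solutions in a natural way.
print(biased_randomized_construction(range(10), cost=lambda e: e))
```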
Abstract:
The decision to publish educational materials openly and under free licenses brings up the challenge of doing so in a sustainable way. Some lessons can be learned from the business models for the production, maintenance and distribution of Free and Open Source Software. The Free Technology Academy (FTA) has taken on these challenges and has implemented some of these models. We briefly review the FTA educational programme, methodologies and organisation, and examine to what extent these models are proving successful in the case of the FTA.
Abstract:
Open educational resource (OER) initiatives have made the shift from being a fringe activity to one that is increasingly considered a key component of both teaching and learning in higher education and of the fulfilment of universities' missions and goals. Although the reduction in the cost of materials is often cited as a potential benefit of OER, this benefit has not yet been realised in practice, necessitating thoughtful consideration of various strategies for new OER initiatives such as the OpenContent directory at the University of Cape Town (UCT) in South Africa. This paper reviews the range of sustainability strategies mentioned in the literature, plots the results of a small-scale OER sustainability survey against these strategies, and explains how these findings and other papers on OER initiatives were used to inform an in-house workshop at UCT to deliberate the future strategy for the sustainability of OER at UCT.
Abstract:
This project undertakes research both into finding predictors via clustering techniques and into reviewing free Data Mining software. The research is based on a case study, from which, in addition to the free KDD software used by the scientific community, a new free tool for pre-processing the data is presented. The predictors are intended for the e-learning domain, since the data from which they have to be inferred are student qualifications from different e-learning environments. Through our case study, not only are clustering algorithms tested, but additional goals are also proposed.
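As a minimal sketch of the kind of clustering step described (the data layout and the use of scikit-learn's KMeans are illustrative assumptions, not the project's actual toolchain):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical matrix: one row per student, one column per
# assessment grade taken from an e-learning environment.
grades = np.array([
    [7.5, 8.0, 6.5],
    [4.0, 5.5, 5.0],
    [9.0, 8.5, 9.5],
    [3.5, 4.0, 4.5],
])

# Group students into performance profiles; the cluster label can
# then serve as a candidate predictor of final outcomes.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(grades)
print(model.labels_)           # cluster assignment per student
print(model.cluster_centers_)  # profile centroids
```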
Abstract:
Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the inception of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because PLS does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, both SEM and PLS approaches were compared by evaluating perceptions of the products and customer service of the Isle of Man Post Office using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.
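The reported structural relations (quality driving satisfaction; image and satisfaction driving loyalty) can be written as a path model. Below is a minimal sketch using the semopy SEM package in Python, assuming the constructs are observed composite scores on simulated data; the paper's actual model uses latent variables and robust estimators, which this sketch does not reproduce:

```python
import numpy as np
import pandas as pd
import semopy

# Simulated composite scores per respondent (illustrative only).
rng = np.random.default_rng(0)
n = 200
quality = rng.normal(size=n)
image = rng.normal(size=n)
satisfaction = 0.8 * quality + rng.normal(scale=0.5, size=n)
loyalty = 0.5 * image + 0.6 * satisfaction + rng.normal(scale=0.5, size=n)
data = pd.DataFrame({"quality": quality, "image": image,
                     "satisfaction": satisfaction, "loyalty": loyalty})

# Structural part of a simple CSI-style path model.
desc = """
satisfaction ~ quality + image
loyalty ~ image + satisfaction
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # path coefficients and standard errors
```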
Abstract:
This paper describes systematic research into free software solutions and techniques for the problem of computer recognition of art imagery.
Abstract:
The article presents and discusses estimates of social and economic indicators for Italy's regions in benchmark years roughly from Unification to the present day: life expectancy, education, GDP per capita at purchasing power parity, and the new Human Development Index (HDI). A broad interpretative hypothesis, based on the distinction between passive and active modernization, is proposed to account for the evolution of regional imbalances over the long run. In the absence of active modernization, Southern Italy converged thanks to passive modernization, i.e., State intervention: this was more effective in life expectancy, less successful in education, and expensive and as a whole ineffective in GDP. As a consequence, convergence in the HDI occurred from the late XIX century to the 1970s, but came to a sudden halt in the last decades of the XX century.
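For reference, the "new" HDI mentioned above is, in the UNDP's 2010 revision, the geometric mean of three normalized component indices, with income entering in logs. A minimal Python sketch, where the goalposts are illustrative assumptions and the paper's historical reconstruction may normalize differently:

```python
import math

def component_index(value, lo, hi, log_scale=False):
    """Normalize a raw value to [0, 1] between goalposts lo and hi."""
    if log_scale:  # income enters the HDI in logs
        return (math.log(value) - math.log(lo)) / (math.log(hi) - math.log(lo))
    return (value - lo) / (hi - lo)

def hdi(life_exp, education_idx, gdp_pc, bounds):
    """Geometric mean of the three component indices (2010-style HDI).
    `bounds` holds illustrative goalposts; a historical series may
    normalize differently. education_idx is assumed pre-normalized."""
    health = component_index(life_exp, *bounds["life"])
    income = component_index(gdp_pc, *bounds["gdp"], log_scale=True)
    return (health * education_idx * income) ** (1.0 / 3.0)

bounds = {"life": (20.0, 85.0), "gdp": (100.0, 75000.0)}
print(hdi(life_exp=82.0, education_idx=0.80, gdp_pc=35000.0, bounds=bounds))
```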
Abstract:
The availability of induced pluripotent stem cells (iPSCs) has created extraordinary opportunities for modeling and perhaps treating human disease. However, all reprogramming protocols used to date involve the use of products of animal origin. Here, we set out to develop a protocol to generate and maintain human iPSCs that would be entirely devoid of xenobiotics. We first developed a xeno-free cell culture media that supported the long-term propagation of human embryonic stem cells (hESCs) to a similar extent as conventional media containing animal-origin products or commercially available xeno-free medium. We also derived primary cultures of human dermal fibroblasts under strict xeno-free conditions (XF-HFF), and we show that they can be used as both the cell source for iPSC generation and autologous feeder cells to support their growth. We also replaced other reagents of animal origin (trypsin, gelatin, matrigel) with their recombinant equivalents. Finally, we used vesicular stomatitis virus G-pseudotyped retroviral particles expressing a polycistronic construct encoding Oct4, Sox2, Klf4, and GFP to reprogram XF-HFF cells under xeno-free conditions. A total of 10 xeno-free human iPSC lines were generated, which could be continuously passaged in xeno-free conditions and maintained characteristics indistinguishable from hESCs, including colony morphology and growth behavior, expression of pluripotency-associated markers, and pluripotent differentiation ability in vitro and in teratoma assays. Overall, the results presented here demonstrate that human iPSCs can be generated and maintained under strict xeno-free conditions and provide a path to good manufacturing practice (GMP) applicability that should facilitate the clinical translation of iPSC-based therapies.
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
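A minimal sketch of the forward Eulerian integration step described above, assuming the velocity field is available as a function v(x, t); the toy analytic field here stands in for the paper's sum of spatiotemporal B-spline kernels:

```python
import numpy as np

def integrate_displacement(v, x0, t0, t1, n_steps=100):
    """Recover the trajectory of a material point by forward Euler
    integration of a non-stationary velocity field v(x, t).
    Returns the displacement x(t1) - x(t0)."""
    x = np.asarray(x0, dtype=float).copy()
    dt = (t1 - t0) / n_steps
    t = t0
    for _ in range(n_steps):
        x += dt * v(x, t)  # forward Euler step along the flow
        t += dt
    return x - np.asarray(x0, dtype=float)

# Toy non-stationary velocity field (stand-in for the B-spline sum).
def v(x, t):
    return np.array([-x[1], x[0]]) * np.cos(t)

print(integrate_displacement(v, x0=[1.0, 0.0], t0=0.0, t1=1.0))
```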
Abstract:
This paper describes a Computer-Supported Collaborative Learning (CSCL) case study in engineering education carried out within the context of a network management course. The case study shows that the use of two computing tools developed by the authors and based on Free and Open-Source Software (FOSS) provides significant educational benefits over traditional engineering pedagogical approaches in terms of both concept and engineering-competency acquisition. First, the Collage authoring tool guides and supports the course teacher in the process of authoring computer-interpretable representations (using the IMS Learning Design standard notation) of effective collaborative pedagogical designs. In addition, the Gridcole system supports the enactment of those designs by guiding the students throughout the prescribed sequence of learning activities. The paper introduces the goals and context of the case study, elaborates on how Collage and Gridcole were employed, describes the applied evaluation methodology, and discusses the most significant findings derived from the case study.
Abstract:
In much of the western world, and particularly in Europe, there is a widespread perception that multiculturalism has ‘failed’ and that governments who once embraced a multicultural approach to diversity are turning away, adopting a strong emphasis on civic integration. This reaction, we are told, “reflects a seismic shift not just in the Netherlands, but in other European countries as well” (JOPPKE 2007). This paper challenges this view. Drawing on an updated version of the Multiculturalism Policy Index introduced earlier (BANTING and KYMLICKA 2006), the paper presents an index of the strength of multicultural policies for European countries and several traditional countries of immigration at three points in time (1980, 2000 and 2010). The results paint a different picture of contemporary experience in Europe. While a small number of countries, including most notably the Netherlands, have weakened established multicultural policies during the 2000s, such a shift is the exception. Most countries that adopted multicultural approaches in the later part of the twentieth century have maintained their programs in the first decade of the new century; and a significant number of countries have added new ones. In much of Europe, multicultural policies are not in general retreat. As a result, the turn to civic integration is often being layered on top of existing multicultural programs, leading to a blended approach to diversity. The paper reflects on the compatibility of multiculturalism policies and civic integration, arguing that more liberal forms of civic integration can be combined with multiculturalism but that more illiberal or coercive forms are incompatible with a multicultural approach.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic by simply incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
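An illustrative skeleton of the ILS loop described above; the identity local search, swap perturbation, and non-worsening acceptance rule in the toy usage are simplifying assumptions, not the paper's operators:

```python
import random

def ils(initial_solution, local_search, perturb, cost, max_iters=1000):
    """Parameter-light Iterated Local Search skeleton: improve a
    starting point, then alternate perturbation and local search,
    keeping a new solution only if it does not worsen the cost."""
    best = local_search(initial_solution)
    current = best
    for _ in range(max_iters):
        candidate = local_search(perturb(current))
        if cost(candidate) <= cost(current):  # simple acceptance rule
            current = candidate
        if cost(current) < cost(best):
            best = current
    return best

# Toy usage on a permutation problem: recover the identity ordering.
target = list(range(8))
start = random.sample(target, len(target))
cost = lambda p: sum(abs(v - i) for i, v in enumerate(p))

def swap(p):
    """Perturbation: exchange two randomly chosen positions."""
    p = p[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

print(ils(start, local_search=lambda p: p, perturb=swap, cost=cost))
```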
Abstract:
We investigate why equity return correlations changed over the last century. Using a new, long-run dataset on capital account regulations in a group of 16 countries over the period 1890-2001, we show that correlations increase as financial markets are liberalized. These findings are robust to controlling for both the Forbes-Rigobon bias and global averages in equity return correlations. We test the robustness of our conclusions, and show that greater synchronization of fundamentals is not the main cause of increasing correlations. These results imply that the home bias puzzle may be smaller than traditionally claimed.
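For context, the Forbes-Rigobon bias mentioned above arises because correlations measured in high-volatility periods are mechanically inflated. A minimal sketch of the standard adjustment from Forbes and Rigobon (2002); the paper's exact implementation may differ:

```python
import math

def forbes_rigobon_adjust(rho, var_high, var_low):
    """Correct a correlation measured in a high-volatility period for
    the mechanical inflation caused by increased variance; delta is
    the relative increase in the conditioning asset's variance."""
    delta = var_high / var_low - 1.0
    return rho / math.sqrt(1.0 + delta * (1.0 - rho ** 2))

# Example: a 0.60 correlation measured while variance doubled.
print(forbes_rigobon_adjust(0.60, var_high=2.0, var_low=1.0))
```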
Abstract:
This paper proposes a method to conduct inference in panel VAR models with cross-unit interdependencies and time variations in the coefficients. The approach can be used to obtain multi-unit forecasts and leading indicators and to conduct policy analysis in a multi-unit setup. The framework of analysis is Bayesian, and MCMC methods are used to estimate the posterior distribution of the features of interest. The model is reparametrized to resemble an observable index model, and specification searches are discussed. As an example, we construct leading indicators for inflation and GDP growth in the Euro area using G-7 information.
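A minimal simulation sketch of the model class described above: a two-unit VAR(1) whose coefficient matrix drifts as a random walk, giving both cross-unit interdependencies and time-varying coefficients. Dimensions and noise scales are illustrative assumptions, and the sketch does not attempt the paper's Bayesian MCMC estimation:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 200, 2  # periods, cross-sectional units

# Initial coefficient matrix: off-diagonal terms create
# cross-unit interdependencies.
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])

y = np.zeros((T, n))
for t in range(1, T):
    A = A + 0.005 * rng.normal(size=(n, n))  # random-walk drift in coefficients
    radius = np.max(np.abs(np.linalg.eigvals(A)))
    if radius > 0.95:                        # rescale to keep the system stable
        A = A * (0.95 / radius)
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=n)

print(y[-5:])  # last few observations of the simulated panel
```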