74 results for data sheets

at Universidade do Minho


Relevance:

20.00%

Publisher:

Abstract:

One of the biggest concerns in the Tissue Engineering field is the correct vascularization of engineered constructs. Strategies involving the use of endothelial cells are promising, but adequate cell sourcing and neo-vessel stability are enduring challenges. In this work, we propose the hypoxic pre-conditioning of the stromal vascular fraction (SVF) of human adipose tissue to obtain highly angiogenic cell sheets (CS). For that, the SVF was isolated after enzymatic dissociation of adipose tissue and cultured in basal medium until CS formation, under normoxic (pO2 = 21%) and hypoxic (pO2 = 5%) conditions, for 5 and 8 days. Immunocytochemistry against CD31 and CD146 revealed the presence of highly branched capillary-like structures, which were far more complex under hypoxia. ELISA quantification showed increased VEGF and TIMP-1 secretion after 8 days of culture under hypoxia. In a Matrigel assay, the formation of capillary-like structures by endothelial cells was more prominent when they were cultured in conditioned medium recovered from the hypoxic cultures. The same conditioned medium increased the migration of adipose stromal cells in a scratch assay, when compared with medium from normoxia. Histological analysis after implantation of 8-day normoxic- and hypoxic-conditioned SVF CS in a hindlimb ischemia murine model showed improved formation of neo-blood vessels. Furthermore, Laser Doppler results demonstrated that blood perfusion of the injured limb after 30 days was enhanced for the hypoxic CS group. Overall, these results suggest that SVF CS created under hypoxia can be used as functional vascularization units for tissue engineering and regenerative medicine.

Relevance:

20.00%

Publisher:

Abstract:

Tendon regeneration is limited, demanding cell-based strategies to fully restore tendon functionality upon injury. The concept of magnetic force-based TE(1), generally using magnetic nanoparticles, may enable, for example, stem cell stimulation and/or remote control over TE constructs. Thus, we originally propose the development of magnetic cell sheets (magCSs) with tenogenic capability, aimed at promoting tendon regeneration. A Tenomodulin-positive (TNMD+) subpopulation was sorted from human adipose stem cells (hASCs) using TNMD-coated immunomagnetic beads(2) and used as the cell source for the development of magCSs. Briefly, cells were labeled with iron oxide composite particles (Micromod) and cultured for 7 days in α-MEM medium, with or without magnetic stimulation provided by a magnetic device (nanoTherics). CSs were retrieved from the plates by magnet attraction as contiguous sheets of cells within their own deposited ECM.

Relevance:

20.00%

Publisher:

Abstract:

Cell sheets of hASCs (hASCs-CS) have been previously proposed for wound healing applications(1, 2), and despite the concern for reducing production time, the possibility of having these hASCs-CS off-the-shelf is appealing. The goal of this work was to define a cryopreservation methodology that preserves cell viability and the properties of the CS matrix. hASCs-CS obtained from three different donors were created in UP-cell thermoresponsive dishes (Nunc, Germany) as previously reported(1, 2). Different cryopreservation conditions were considered: i) FBS plus DMSO (5% and 10%); ii) 0.4 M trehalose plus DMSO (5% and 10%); iii) the cryosolution PLL (Akron Biotech, USA); and iv) vitrification. The effect of cryopreservation on cellular viability was first assessed by flow cytometry using 7-AAD, after dissociating the hASCs-CS with collagenase and trypsin-EDTA 0.25%. The expression (RT-PCR) and deposition (western blot and immunocytochemistry) of collagen type I, laminin and fibronectin, and the organization (TEM) of the extracellular matrix, were further assessed before and after hASCs-CS cryopreservation to determine a potential effect of the method on matrix composition and integrity. The obtained results confirmed that cell viability is affected by the cryopreservation methodology, as shown before for different CS(3). Interestingly, the matrix properties were not significantly altered and the typical cell sheet's easiness of manipulation for transplantation was not lost.

Relevance:

20.00%

Publisher:

Abstract:

As huge amounts of data become available in organizations and society, specific data analytics skills and techniques are needed to explore these data and extract from them useful patterns, tendencies, models or other knowledge that can be used to support the decision-making process, to define new strategies or to understand what is happening in a specific field. Only with a deep understanding of a phenomenon is it possible to fight it. In this paper, a data-driven analytics approach is used to analyse the increasing incidence of fatalities by pneumonia in the Portuguese population, characterizing the disease and its incidence in terms of fatalities. This knowledge can be used to define appropriate strategies aimed at reducing this phenomenon, which has grown by more than 65% in a decade.
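The headline figure above is a relative increase over a decade. As a minimal sketch of that calculation, with purely illustrative counts (the real figures come from the Portuguese health statistics used in the paper):

```python
# Hypothetical annual pneumonia fatality counts; illustrative values only.
fatalities = {2003: 4200, 2013: 7000}

def percent_increase(start, end):
    """Relative increase between two counts, in percent."""
    return (end - start) / start * 100.0

increase = percent_increase(fatalities[2003], fatalities[2013])
print(f"Increase over the decade: {increase:.1f}%")  # with these values: 66.7%
```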

Relevance:

20.00%

Publisher:

Abstract:

Cell-cell and cell-extracellular matrix (ECM) dynamic interactions appear to have a major role in regulating communication through soluble signaling, directing cell binding and activating substrates that participate in the highly organized wound healing process. Moreover, these interactions are also crucial for mimicking cutaneous physiology in vitro. Herein we explore cell sheet (CS) engineering to create cellular constructs formed by keratinocytes (hKC), fibroblasts (hDFB) and dermal microvascular endothelial cells (hDMEC), targeting skin wound healing but also the in vitro recreation of relevant models. Taking advantage of temperature-responsive culture surfaces, which allow harvesting cultured cells as intact sheets along with the deposited native ECM, varied combinations of homotypic and heterotypic three-dimensional (3D) CS-based constructs were developed. Constructs combining one CS of keratinocytes, as an epidermis-like layer, plus a vascularized dermis composed of hDFB and hDMEC were assembled as skin analogues for advanced in vitro testing. Both hKC and hDMEC were shown to contribute significantly to the re-epithelialization of full-thickness mouse skin wounds by promoting early epithelial coverage, while hDMEC significantly increased vessel density by incorporating the neovasculature. Thus, although the outcomes were determined by the cellular nature of the constructs, they demonstrate that CS engineering is a unique technology that opens the possibility of creating numerous combinations of 3D constructs to target defective wound healing, as well as of building in vitro models that further mimic cutaneous functions, which is crucial for drug screening and cosmetic testing assays.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a methodology, based on Bayesian data fusion techniques applied to non-destructive and destructive tests, for the structural assessment of historical constructions. The aim of the methodology is to reduce the uncertainty of the parameter estimation. The Young's modulus of granite stones was chosen as the example for the present paper. The methodology considers several levels of uncertainty, since the parameters of interest are treated as random variables with random moments. A new concept, the Trust Factor, was introduced to weight the uncertainty associated with each test result, translated by its standard deviation, according to the higher or lower reliability of each test in predicting a certain parameter.
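A minimal sketch of the underlying fusion idea, under simplifying assumptions not taken from the paper: each test result is modelled as an independent Gaussian estimate and combined by precision weighting, with a hypothetical trust factor in (0, 1] that inflates the standard deviation of less reliable tests. The numbers are illustrative, not measured values.

```python
def fuse(estimates):
    """Precision-weighted (Gaussian) fusion of independent estimates.

    estimates: list of (mean, std, trust_factor) tuples. The trust factor
    (0 < TF <= 1) inflates the standard deviation of less reliable tests;
    this is one plausible reading of the paper's Trust Factor concept.
    """
    # Effective variance per test after applying the trust factor
    variances = [(m, (s / tf) ** 2) for m, s, tf in estimates]
    precision = sum(1.0 / v for _, v in variances)
    mean = sum(m / v for m, v in variances) / precision
    return mean, (1.0 / precision) ** 0.5

# Young's modulus of granite from a sonic (NDT) and a compression (DT) test, in GPa
mean, std = fuse([(55.0, 8.0, 0.7),   # non-destructive: less trusted
                  (60.0, 4.0, 1.0)])  # destructive: fully trusted
print(f"Fused estimate: {mean:.1f} +/- {std:.1f} GPa")
```

The fused mean lands between the two test results, closer to the more precise and more trusted one, and the fused standard deviation is smaller than either input, which is the uncertainty-reduction effect the methodology targets.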

Relevance:

20.00%

Publisher:

Abstract:

Hospitals are nowadays collecting vast amounts of data related to patient records. All these data hold valuable knowledge that can be used to improve hospital decision making. Data mining techniques aim precisely at the extraction of useful knowledge from raw data. This work describes the implementation of a medical data mining project based on the CRISP-DM methodology. Recent real-world data, from 2000 to 2013, related to inpatient hospitalization, were collected from a Portuguese hospital. The goal was to predict generic hospital Length Of Stay based on indicators that are commonly available at the hospitalization process (e.g., gender, age, episode type, medical specialty). At the data preparation stage, the data were cleaned and variables were selected and transformed, leading to 14 inputs. Next, at the modeling stage, a regression approach was adopted, in which six learning methods were compared: Average Prediction, Multiple Regression, Decision Tree, Artificial Neural Network ensemble, Support Vector Machine and Random Forest. The best learning model was obtained by the Random Forest method, which achieved a high-quality coefficient of determination (0.81). This model was then opened up using a sensitivity analysis procedure that revealed three influential input attributes: the hospital episode type, the physical service where the patient is hospitalized and the associated medical specialty. Such extracted knowledge confirmed that the obtained predictive model is credible and has potential value for supporting the decisions of hospital managers.
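Two evaluation ideas recur in the abstract: the coefficient of determination (R²) used to score the regression models, and the one-at-a-time sensitivity analysis used to rank input attributes. A minimal pure-Python sketch of both, with a hypothetical toy predictor in place of the paper's Random Forest and made-up numbers throughout:

```python
def r2(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

def los_model(episode_type, specialty, age):
    """Hypothetical length-of-stay predictor (days); illustrative only."""
    return 2.0 + 3.0 * episode_type + 0.5 * specialty + 0.05 * age

def sensitivity(model, baseline, deltas):
    """Absolute prediction change when each input is perturbed alone."""
    base = model(**baseline)
    return {k: abs(model(**{**baseline, k: baseline[k] + d}) - base)
            for k, d in deltas.items()}

actual = [5.0, 8.0, 12.0, 7.0]
predicted = [5.5, 7.5, 11.0, 7.5]
print("R2:", round(r2(actual, predicted), 2))

ranks = sensitivity(los_model,
                    {"episode_type": 1, "specialty": 4, "age": 60},
                    {"episode_type": 1, "specialty": 1, "age": 1})
print(ranks)  # episode type dominates, echoing the paper's finding
```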

Relevance:

20.00%

Publisher:

Abstract:

As increasingly sophisticated materials and products are developed and times-to-market need to be minimized, it is important to have fast-response characterization tools, using small amounts of sample, capable of conveying data on the relationships between rheological response, process-induced material structure and product characteristics. For this purpose, a single/twin-screw mini-extrusion system of modular construction, with well-controlled outputs in the range 30-300 g/h, was coupled to an in-house-developed rheo-optical slit die able to measure shear viscosity and normal-stress differences, as well as to perform rheo-optical experiments, namely small-angle light scattering (SALS) and polarized optical microscopy (POM). In addition, the mini-extruder is equipped with ports that allow sample collection, and the extrudate can be further processed into products to be tested later. Here, we present the concept and experimental set-up [1, 2]. As a typical application, we report on the characterization of the processing of a polymer blend and of the properties of extruded sheets. The morphological evolution of a PS/PMMA industrial blend along the extruder, the flow-induced structures developed and the corresponding rheological characteristics are presented, together with the mechanical and structural characteristics of the produced sheets. The application of this experimental tool to other material systems will also be discussed.
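The shear-viscosity measurement mentioned above rests on the standard slit-die relations: apparent wall shear rate 6Q/(WH²) and wall shear stress ΔP·H/(2L), valid for H ≪ W. A minimal sketch with illustrative numbers (not data from the paper), and without the Rabinowitsch-type correction a full analysis would apply:

```python
def slit_die_viscosity(q, w, h, dp, l):
    """Apparent shear viscosity from slit-die data (standard slit formulas,
    valid for slit height much smaller than width; no non-Newtonian
    correction applied).

    q: volumetric flow rate [m^3/s], w: slit width [m], h: slit height [m],
    dp: pressure drop [Pa] measured over a length l [m] along the slit.
    """
    shear_rate = 6.0 * q / (w * h ** 2)   # apparent wall shear rate [1/s]
    shear_stress = dp * h / (2.0 * l)     # wall shear stress [Pa]
    return shear_stress / shear_rate      # apparent viscosity [Pa.s]

# Illustrative values: ~100 g/h of a melt with density ~1000 kg/m^3
# corresponds to q ~ 2.8e-8 m^3/s, inside the extruder's 30-300 g/h range.
eta = slit_die_viscosity(q=2.8e-8, w=10e-3, h=1e-3, dp=2.0e5, l=50e-3)
print(f"apparent viscosity = {eta:.0f} Pa.s")
```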

Relevance:

20.00%

Publisher:

Abstract:

Earthworks tasks aim at levelling the ground surface of a target construction area and precede any kind of structural construction (e.g., road and railway construction). Earthworks comprise sequential tasks, such as excavation, transportation, spreading and compaction, and rely strongly on heavy mechanical equipment and repetitive processes. In this context, it is essential to optimize the usage of all available resources under two key criteria: the cost and duration of the earthwork project. In this paper, we present an integrated system that uses two artificial intelligence techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation considering the two main earthwork objectives (duration and cost). Experiments using real-world data from a construction site have shown that the proposed system is competitive when compared with current manual earthwork designs.
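At the heart of the multi-objective step is the notion of non-dominated solutions: an allocation is worth keeping when no other allocation is at least as good on both cost and duration and strictly better on one. A minimal sketch of that Pareto filter, with hypothetical candidate allocations in place of the evolutionary algorithm's population:

```python
def pareto_front(solutions):
    """Non-dominated (cost, duration) pairs, both objectives minimized."""
    front = []
    for s in solutions:
        dominated = any(
            o != s and o[0] <= s[0] and o[1] <= s[1]
            and (o[0] < s[0] or o[1] < s[1])
            for o in solutions
        )
        if not dominated:
            front.append(s)
    return sorted(front)

# Hypothetical (cost in kEUR, duration in days) of candidate equipment allocations
candidates = [(120, 30), (100, 45), (150, 25), (110, 40), (130, 35)]
print(pareto_front(candidates))
# [(100, 45), (110, 40), (120, 30), (150, 25)] -- (130, 35) is dominated
```

An evolutionary multi-objective optimizer evolves a population toward this front rather than enumerating candidates, but the dominance test that defines "better" is the same.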

Relevance:

20.00%

Publisher:

Abstract:

We are living in the era of Big Data, a time characterized by the continuous creation of vast amounts of data, originating from different sources and in different formats. First with the rise of social networks and, more recently, with the advent of the Internet of Things (IoT), in which everyone and (eventually) everything is linked to the Internet, data with enormous potential for organizations are being continuously generated. In order to be more competitive, organizations want to access and explore all the richness present in those data. Indeed, Big Data is only as valuable as the insights organizations gather from it to make better decisions, which is the main goal of Business Intelligence. In this paper we describe an experiment in which data obtained from a NoSQL data source (a database technology explicitly developed to deal with the specificities of Big Data) are used to feed a Business Intelligence solution.
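One recurring step when feeding a BI solution from a document-oriented NoSQL source is flattening nested documents into the tabular rows BI schemas expect. A minimal sketch of that transformation (the document shape and field names are hypothetical, not the paper's dataset):

```python
def flatten(doc, prefix=""):
    """Flatten one nested (NoSQL-style) document into a flat column->value
    dict, joining nested keys with '.', so it can feed a tabular BI schema."""
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, name + "."))
        else:
            row[name] = value
    return row

# Hypothetical document as it might come from a document store
doc = {"sensor": {"id": 42, "location": "Braga"}, "reading": {"temp": 21.5}}
print(flatten(doc))
# {'sensor.id': 42, 'sensor.location': 'Braga', 'reading.temp': 21.5}
```

A real pipeline would also handle arrays (typically by unnesting them into multiple rows) and type coercion, but the key-path flattening above is the core of mapping document data onto BI dimensions and measures.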

Relevance:

20.00%

Publisher:

Abstract:

Master's internship report in Teaching in the 1st and 2nd Cycles of Basic Education (Ensino do 1º e 2º Ciclo do Ensino Básico)

Relevance:

20.00%

Publisher:

Abstract:

Studies in Computational Intelligence, 616

Relevance:

20.00%

Publisher:

Abstract:

Over the last few years, many research efforts have been made to improve the design of ETL (Extract-Transform-Load) systems. ETL systems are considered very time-consuming, error-prone and complex, involving several participants from different knowledge domains. ETL processes are one of the most important components of a data warehousing system and are strongly influenced by the complexity of business requirements and by their change and evolution. These aspects influence not only the structure of a data warehouse but also the structures of the data sources involved. To minimize the negative impact of such variables, we propose the use of ETL patterns to build specific ETL packages. In this paper, we formalize this approach using BPMN (Business Process Model and Notation) for modelling more conceptual ETL workflows, mapping them to real execution primitives through a domain-specific language that allows the generation of specific instances that can be executed in a commercial ETL tool.
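The core idea, a reusable pattern instantiated into concrete execution primitives via a parameter specification, can be sketched as simple template substitution. Everything here is illustrative: the pattern name, the primitive strings and the field names are stand-ins, not the paper's DSL or its BPMN mapping.

```python
from string import Template

# A toy "surrogate key lookup" ETL pattern expressed as primitive templates
SURROGATE_KEY_PATTERN = [
    Template("EXTRACT $source_table FROM $source"),
    Template("LOOKUP $business_key IN dim_$dimension RETURN $surrogate_key"),
    Template("LOAD INTO $target_table"),
]

def instantiate(pattern, spec):
    """Render a conceptual pattern into concrete primitives for one use case."""
    return [step.substitute(spec) for step in pattern]

spec = {"source": "crm", "source_table": "customers", "business_key": "nif",
        "dimension": "customer", "surrogate_key": "customer_sk",
        "target_table": "dim_customer"}
for line in instantiate(SURROGATE_KEY_PATTERN, spec):
    print(line)
```

The benefit mirrors the paper's argument: the pattern is validated once at the conceptual level, and each concrete ETL package is generated from a small specification instead of being hand-built.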

Relevance:

20.00%

Publisher:

Abstract:

The computational resources demanded by the processing of large volumes of data during a data warehouse populating process mean that the search for new implementations must also take into account the energy efficiency of the various processing components that make up any populating system. The lack of techniques or methodologies to categorize and evaluate energy consumption in data warehouse populating systems is clearly evident. Access to this kind of information would make it possible to build data warehouse populating systems with lower, and therefore more efficient, levels of energy consumption. Starting from the adaptation of techniques applied to database management systems for obtaining the energy consumption of query execution, we designed and implemented a new technique that allows us to obtain the energy consumption of any data warehouse populating process, by evaluating the consumption of each of the components used in its implementation with a conventional tool. In this paper we present how we perform such an evaluation, demonstrating the viability of our proposal with a populating process that is quite typical in data warehouses, the chained replacement of operational keys, implemented with the Kettle tool.
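The per-component accounting described above can be sketched in its simplest form: the energy of a populating process is the sum, over its components, of average power draw times execution time. Component names and figures below are illustrative stand-ins, not measurements from the paper, and a real setup would obtain power from instrumentation rather than constants.

```python
# Hypothetical components of a Kettle-style populating process:
# (name, average power in watts, runtime in seconds)
components = [
    ("table input",      35.0, 120.0),
    ("key lookup",       55.0, 300.0),
    ("dimension update", 48.0, 180.0),
]

def process_energy(components):
    """Total energy of a populating process, in joules and watt-hours."""
    joules = sum(power * seconds for _, power, seconds in components)
    return joules, joules / 3600.0

j, wh = process_energy(components)
print(f"{j:.0f} J = {wh:.2f} Wh")  # 29340 J = 8.15 Wh
```

Breaking the total down by component, rather than measuring only the whole process, is what lets the authors spot which steps of a populating workflow dominate consumption.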