986 results for Split application


Relevance:

20.00%

Publisher:

Abstract:

In order to effect permanent closure in burns patients suffering from full-thickness wounds, replacing their skin via split-thickness autografting is essential. Dermal substitutes in conjunction with widely meshed split-thickness autografts (+/- cultured keratinocytes) reduce scarring at the donor and recipient sites of burns patients by reducing demand for autologous skin (both surface area and thickness), without compromising dermal delivery at the wound face. Tissue-engineered products such as Integra consist of a dermal template which is rapidly remodelled to form a neodermis, at which time the temporary silicone outer layer is removed and replaced with autologous split-thickness skin. Whilst provision of a thick tissue-engineered dermis at full-thickness burn sites reduces scarring, it is hampered by delays in vascularisation that result in clinical failure. The ultimate success of any skin graft product is dependent upon a number of basic factors, including adherence and haemostasis; in the case of viable tissue grafts, success is ultimately dependent upon restoration of a normal blood supply, and hence this study. Ultimately, the goal of this research is to improve the therapeutic properties of tissue replacements through impregnation with growth factors aimed at stimulating migration and proliferation of microvascular endothelial cells into the donor tissue post grafting. For my Masters, the aim was to evaluate the responsiveness of a dermal microvascular endothelial cell line to growth factors and haemostatic factors in the presence of the glycoprotein vitronectin. Vitronectin formed the backbone of my hypothesis and research due to its association with both epithelial and, more specifically, endothelial migration and proliferation. Early work using a platform technology referred to as VitroGro (Tissue Therapies Ltd), which comprises vitronectin-bound BP5/IGF-1, showed that it aided keratinocyte proliferation.
I hypothesised that this result would translate to another epithelium: the endothelium. However, VitroGro had no effect on endothelial proliferation or migration. Vitronectin increases the presence of Fibroblast Growth Factor (FGF) and Vascular Endothelial Growth Factor (VEGF) receptors, enhancing cell responsiveness to their respective ligands. So, although Human Microvascular Endothelial Cell line 1 (HMEC-1) VEGF receptor expression is generally low, it was hypothesised that exposure to vitronectin would up-regulate this receptor. HMEC-1 migration, but not proliferation, was enhanced by vitronectin-bound VEGF, as well as vitronectin-bound Epidermal Growth Factor (EGF), both of which could be used to stimulate microvascular endothelial cell migration for the purpose of transplantation. In addition to vitronectin's synergy with various growth factors, it has also been shown to play a role in haemostasis. Vitronectin binds thrombin-antithrombin III (TAT) to form a trimeric complex that takes on many of the attributes of vitronectin, such as heparin affinity, which results in its adherence to endothelium via heparan sulfate proteoglycans (HSPGs), followed by unaltered transcytosis through the endothelium, and ultimately its removal from the circulation. This has been documented as a mechanism designed to remove thrombin from the circulation. Equally, it could be argued that it is a mechanism for delivering vitronectin to the matrix. My results show that matrix-bound vitronectin dramatically alters the effect that conformationally altered antithrombin III (cATIII) has on proliferation of microvascular endothelial cells: cATIII stimulates HMEC-1 proliferation in the presence of matrix-bound vitronectin, as opposed to inhibiting proliferation in its absence.
Binding vitronectin to tissues and organs prior to transplant, in the presence of cATIII, will have a profound effect on microvascular infiltration of the graft, by preventing occlusion of existing vessels whilst stimulating migration and proliferation of endothelium within the tissue.

Relevance:

20.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
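The free-path sampling at the heart of such a photon transport code can be sketched as follows. This is a minimal illustration of the Monte Carlo principle for a homogeneous slab, not the thesis's code; the attenuation coefficient and thickness are arbitrary example values.

```python
import numpy as np

def slab_transmission(mu, thickness, n_photons, seed=0):
    """Fraction of photons crossing a homogeneous slab without interacting.

    mu        -- linear attenuation coefficient (1/cm)
    thickness -- slab thickness (cm)
    """
    rng = np.random.default_rng(seed)
    # Sample free path lengths from the exponential distribution:
    # s = -ln(U) / mu, with U uniform on (0, 1].
    paths = -np.log(1.0 - rng.random(n_photons)) / mu
    # A photon is transmitted if its first interaction lies beyond the slab.
    return np.mean(paths > thickness)

# The Monte Carlo estimate should agree with the Beer-Lambert prediction
# exp(-mu * t) to within statistical uncertainty.
mc = slab_transmission(mu=0.2, thickness=5.0, n_photons=100_000)
analytic = np.exp(-0.2 * 5.0)
```

A full code adds interaction-type sampling (photoelectric, coherent and incoherent scatter) and scattering-angle sampling at each interaction point, which is where the electron binding energy corrections studied in the thesis enter.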
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: 1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements, 2.
demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique, 3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral, and 4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
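The two-component decomposition underlying DEXA can be sketched as a 2x2 linear system: the log-attenuation measured at each of the two beam energies is a linear combination of the areal densities of bone mineral and soft tissue. The mass attenuation coefficients below are illustrative placeholders, not values from the thesis.

```python
import numpy as np

# Rows: low energy, high energy. Columns: bone mineral, soft tissue.
# Units: mass attenuation coefficients in cm^2/g (illustrative numbers).
M = np.array([[0.60, 0.25],
              [0.30, 0.20]])

true_areal = np.array([1.2, 20.0])   # areal densities in g/cm^2

# Forward model: ln(I0/I) at each energy is M times the areal densities.
log_atten = M @ true_areal

# Decomposition: solve the 2x2 linear system for the areal densities.
recovered = np.linalg.solve(M, log_atten)
```

The DPA(+) technique described above adds a third equation, the linear path-length measurement, so that the system can be extended to three unknowns (bone mineral, fat, lean soft tissue).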

Relevance:

20.00%

Publisher:

Abstract:

Safety-compromising accidents occur regularly in the led outdoor activity domain. Formal accident analysis is an accepted means of understanding such events and improving safety. Despite this, there remains no universally accepted framework for collecting and analysing accident data in the led outdoor activity domain. This article presents an application of Rasmussen's risk management framework to the analysis of the Lyme Bay sea canoeing incident. This involved the development of an Accimap, the outputs of which were used to evaluate seven predictions made by the framework. The Accimap output was also compared to an analysis using an existing model from the led outdoor activity domain. In conclusion, the Accimap output was found to be more comprehensive and supported all seven of the risk management framework's predictions, suggesting that it shows promise as a theoretically underpinned approach for analysing, and learning from, accidents in the led outdoor activity domain.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the commonly used switching schemes for sliding mode control of power converters are analyzed and designed in the frequency domain. A particular application of a distribution static compensator (DSTATCOM) in voltage control mode is investigated in a power distribution system. Tsypkin's method and the describing function are used to obtain the switching conditions for the two-level and three-level voltage source inverters. Magnitude conditions on the carrier signals are developed for robust switching of the inverter under a carrier-based modulation scheme of sliding mode control. The existence of border collision bifurcation is identified so that the complex switching states of the inverter can be avoided. The load bus voltage of an unbalanced three-phase nonstiff radial distribution system is controlled using the proposed carrier-based design. The results are validated using PSCAD/EMTDC simulation studies and through a scaled laboratory model of a DSTATCOM developed for experimental verification.
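The describing-function step of such an analysis can be sketched numerically. An ideal relay (output +/-M) stands in for the switch, and the plant G(s) = 1 / (s(s + 1)(s + 2)) is an illustrative transfer function, not the DSTATCOM model from the paper; the predicted limit cycle satisfies G(jw) = -1/N(A) with N(A) = 4M/(pi*A).

```python
import numpy as np

def predict_limit_cycle(M=1.0):
    """Predict the switching limit cycle of an ideal relay in feedback
    with G(s) = 1 / (s (s + 1)(s + 2)) via the describing function."""
    w = np.linspace(0.1, 10.0, 500_000)
    s = 1j * w
    G = 1.0 / (s * (s + 1.0) * (s + 2.0))
    # Phase condition: find where G(jw) crosses the negative real axis.
    i = np.argmin(np.abs(np.angle(G) + np.pi))
    w_osc = w[i]
    # Magnitude condition: |G(jw)| = 1/N(A)  =>  A = 4 M |G(jw)| / pi.
    A = 4.0 * M * np.abs(G[i]) / np.pi
    return w_osc, A

w_osc, A = predict_limit_cycle()
```

For this example plant the crossing is analytically at w = sqrt(2) rad/s, where |G| = 1/6, giving an oscillation amplitude A = 4/(6*pi); Tsypkin's method refines such predictions by accounting for the exact relay switching instants rather than the first harmonic only.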

Relevance:

20.00%

Publisher:

Abstract:

Computer-aided technologies, medical imaging, and rapid prototyping have created new possibilities in biomedical engineering. The systematic variation of scaffold architecture, as well as the mineralization inside a scaffold/bone construct, can be studied using computer imaging technology, CAD/CAM, and micro-computed tomography (micro-CT). In this paper, the potential of combining these technologies has been exploited in the study of scaffolds and osteochondral repair. Porosity, surface area per unit volume and the degree of interconnectivity were evaluated through imaging and computer-aided manipulation of the scaffold scan data. For the osteochondral model, the spatial distribution and the degree of bone regeneration were evaluated. In this study, the versatility of two software packages, Mimics (Materialise) and CTAn with 3D realistic visualization (SkyScan), was also assessed.
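Two of the metrics mentioned, porosity and surface area per unit volume, can be sketched directly on segmented micro-CT voxel data. The synthetic volume below (a cubic pore inside a solid block) is a stand-in for a real binarised scan; interconnectivity analysis would additionally require a connected-component labelling step.

```python
import numpy as np

def porosity(vol):
    """Void fraction of a binary volume (1 = solid, 0 = pore)."""
    return 1.0 - vol.mean()

def surface_area_per_volume(vol, voxel=1.0):
    """Solid/pore interface area divided by total volume, estimated by
    counting faces where a solid voxel borders a pore voxel along each
    axis (faces on the outer boundary of the volume are not counted)."""
    faces = 0
    for axis in range(3):
        a = np.swapaxes(vol, 0, axis)
        faces += np.sum(a[1:] != a[:-1])
    return faces * voxel ** 2 / (vol.size * voxel ** 3)

# Synthetic scaffold: a 20^3 solid block containing a 10^3 cubic pore.
vol = np.ones((20, 20, 20), dtype=np.uint8)
vol[5:15, 5:15, 5:15] = 0
```

For this toy volume the porosity is 1000/8000 = 0.125 and the pore exposes 6 faces of 10x10 voxels, giving 600/8000 = 0.075 interface area per unit volume.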

Relevance:

20.00%

Publisher:

Abstract:

Tissue engineering allows the design of functionally active cells within supportive bio-scaffolds to promote the development of new tissues such as cartilage and bone for the restoration of pathologically altered tissues. However, all bone tissue engineering applications are limited by a shortage of stem cells. The adult bone marrow stroma contains a subset of nonhematopoietic cells referred to as bone marrow mesenchymal stem cells (BMSCs). BMSCs are of interest because they are easily isolated from a small aspirate of bone marrow and readily generate single-cell-derived colonies. These cells have the capacity to undergo extensive replication in an undifferentiated state ex vivo. In addition, BMSCs have the potential to develop either in vitro or in vivo into distinct mesenchymal tissues, including bone, cartilage, fat, tendon, muscle, and marrow stroma. Thus, BMSCs are an attractive cell source for tissue engineering approaches. However, BMSCs are not homogeneous, and the quantity of stem cells in the bone marrow decreases in the aged population. A sequential loss of lineage differentiation potential has been found in mixed cultures of bone marrow stromal cells due to their heterogeneous population. Therefore, a number of studies have proposed that homogeneous bone marrow stem cells can be generated from clonal culture of bone marrow cells and that BMSC clones have the greatest potential for the application of bone regeneration in vivo.

Relevance:

20.00%

Publisher:

Abstract:

Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n²) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVM) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task.
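The clustering idea can be sketched with k-means, a standard document-clustering algorithm (not necessarily the one studied in the thesis): documents become vectors and are grouped by nearest centroid. Unlike an all-pairs O(n²) comparison, each k-means iteration costs O(nk) distance computations, which is why it scales to large corpora. The toy 2-D "document" vectors and fixed initial centroids below are illustrative only.

```python
import numpy as np

def kmeans(X, centroids, iters=10):
    """Plain k-means: alternate nearest-centroid assignment and
    centroid recomputation for a fixed number of iterations."""
    for _ in range(iters):
        # Distance from every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        centroids = np.array([X[labels == k].mean(axis=0)
                              for k in range(len(centroids))])
    return labels, centroids

# Two well-separated toy clusters of "document" vectors.
X = np.array([[0.0, 0.1], [0.1, 0.0], [0.1, 0.1],
              [5.0, 5.1], [5.1, 5.0], [5.1, 5.1]])
labels, centroids = kmeans(X, centroids=np.array([[0.0, 0.0], [1.0, 1.0]]))
```

Real document clustering would replace the toy vectors with high-dimensional term-weight vectors, which is where the representation and dimensionality-reduction questions raised above come in.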
Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
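A common extrinsic measure of the kind described is purity (one of several such measures; the abstract does not name a specific one): each cluster is credited with its majority ground-truth class, and purity is the fraction of documents so credited. The labels below are toy data.

```python
from collections import Counter

def purity(clusters, truth):
    """clusters, truth: parallel sequences of cluster ids and true
    class labels. Returns the fraction of items whose cluster's
    majority class matches their true class."""
    total = 0
    for c in set(clusters):
        members = [t for cl, t in zip(clusters, truth) if cl == c]
        # Credit the cluster with its most common ground-truth class.
        total += Counter(members).most_common(1)[0][1]
    return total / len(truth)

clusters = [0, 0, 0, 1, 1, 1]
truth    = ['a', 'a', 'b', 'b', 'b', 'b']
score = purity(clusters, truth)   # (2 + 3) / 6
```

Purity illustrates the subjectivity problem noted above: its value is only as trustworthy as the human-assigned ground-truth labels it compares against.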

Relevance:

20.00%

Publisher:

Abstract:

An automated gas sampling methodology has been used to estimate nitrous oxide (N2O) emissions from a heavy black clay soil in northern Australia where split applications of urea were applied to furrow irrigated cotton. Nitrous oxide emissions from the beds were 643 g N/ha over the 188-day measurement period (after planting), whilst the N2O emissions from the furrows were significantly higher at 967 g N/ha. The DNDC model was used to develop a full season simulation of N2O and N2 emissions. Seasonal N2O emissions were equivalent to 0.83% of applied N, with total gaseous N losses (excluding NH3) estimated to be 16% of the applied N.
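The emission-factor arithmetic behind a figure such as "0.83% of applied N" is a simple ratio of emitted N2O-N to fertiliser N applied. The numbers in the example below are hypothetical; the abstract does not state the applied-N rate.

```python
def emission_factor(n2o_n_g_per_ha, applied_n_kg_per_ha):
    """Seasonal N2O-N emissions as a percentage of fertiliser N applied.

    n2o_n_g_per_ha      -- cumulative N2O-N emitted (g N/ha)
    applied_n_kg_per_ha -- fertiliser N applied (kg N/ha)
    """
    # Convert g to kg, then express as a percentage of applied N.
    return 100.0 * (n2o_n_g_per_ha / 1000.0) / applied_n_kg_per_ha

# Hypothetical example: 1,200 g N2O-N/ha emitted, 160 kg N/ha applied.
ef = emission_factor(1200.0, 160.0)   # 0.75%
```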

Relevance:

20.00%

Publisher:

Abstract:

Neural networks (NNs) are discussed in connection with their possible use in induction machine drives. The mathematical model of the NN as well as a commonly used learning algorithm is presented. Possible applications of NNs to induction machine control are discussed. A simulation of an NN successfully identifying the nonlinear multivariable model of an induction-machine stator transfer function is presented. Previously published applications are discussed, and some possible future applications are proposed.
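The NN model and learning algorithm referred to can be sketched as a one-hidden-layer network trained with backpropagation and gradient descent. The target function (a sine) and the network size here are illustrative; they are not the induction-machine stator model identified in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, (200, 1))
y = np.sin(X)                        # nonlinear mapping to identify

# One hidden layer of tanh units, linear output.
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, out0 = forward(X)
loss0 = np.mean((out0 - y) ** 2)     # error before training

for _ in range(2000):
    h, out = forward(X)
    err = out - y
    # Backpropagate the error through the linear output layer...
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    # ...and through the tanh hidden layer (tanh' = 1 - tanh^2).
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    # Gradient descent update.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out1 = forward(X)
loss1 = np.mean((out1 - y) ** 2)     # error after training
```

In a drive application the same structure would be trained on measured stator voltages and currents rather than a synthetic function.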

Relevance:

20.00%

Publisher:

Abstract:

Design teams are confronted with the quandary of choosing apposite building control systems to suit the needs of particular intelligent building projects, due to the availability of innumerable ‘intelligent’ building products and a dearth of inclusive evaluation tools. This paper develops a model for facilitating the selection and evaluation of intelligent HVAC control systems for commercial intelligent buildings. To achieve these objectives, systematic research activities were conducted: first, to develop, test and refine the general conceptual model using consecutive surveys; then, to convert the developed conceptual framework into a practical model; and, finally, to evaluate the effectiveness of the model by means of expert validation. The results of the surveys are that ‘total energy use’ is perceived as the top selection criterion, followed by ‘system reliability and stability’, ‘operating and maintenance costs’, and ‘control of indoor humidity and temperature’. This research not only presents a systematic and structured approach to evaluating candidate intelligent HVAC control systems against the critical selection criteria (CSC), but also suggests a benchmark for the selection of one control system candidate against another.
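A selection model of this kind can be sketched as a weighted sum over the surveyed criteria. The criterion names follow the abstract, but the weights and candidate scores below are hypothetical illustrations; the paper derives its actual weights from the surveys and expert validation.

```python
# Hypothetical criterion weights (sum to 1), ordered as in the survey
# results, with 'total energy use' weighted highest.
weights = {
    'total energy use': 0.35,
    'system reliability and stability': 0.30,
    'operating and maintenance costs': 0.20,
    'control of indoor humidity and temperature': 0.15,
}

# Hypothetical candidate scores on a 0-10 scale for each criterion.
candidates = {
    'system A': {'total energy use': 7,
                 'system reliability and stability': 9,
                 'operating and maintenance costs': 6,
                 'control of indoor humidity and temperature': 8},
    'system B': {'total energy use': 9,
                 'system reliability and stability': 6,
                 'operating and maintenance costs': 8,
                 'control of indoor humidity and temperature': 7},
}

def score(cand):
    """Weighted-sum score of one candidate across all criteria."""
    return sum(weights[c] * cand[c] for c in weights)

best = max(candidates, key=lambda name: score(candidates[name]))
```

The weighted sum makes the benchmarking idea concrete: two candidates can be ranked against the same CSC, and the sensitivity of the ranking to the weights can be examined directly.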