562 results for micro-process-engineering


Relevance:

40.00%

Publisher:

Abstract:

In the field of tissue engineering, new polymers are needed to fabricate scaffolds with specific properties depending on the targeted tissue. This work aimed at designing and developing a 3D scaffold with variable mechanical strength, a fully interconnected porous network, controllable hydrophilicity and degradability. To this end, a desktop-robot-based melt-extrusion rapid prototyping technique was applied to a novel tri-block co-polymer, poly(ethylene glycol)-block-poly(ε-caprolactone)-block-poly(DL-lactide), PEG-PCL-P(DL)LA. This co-polymer was melted by electrical heating and extruded directly under computer-controlled rapid prototyping by means of compressed purified air to build porous scaffolds. Various lay-down patterns (0/30/60/90/120/150°, 0/45/90/135°, 0/60/120° and 0/90°) were produced through appropriate positioning of the robotic control system. Scanning electron microscopy and micro-computed tomography showed that the 3D scaffold architectures were honeycomb-like, with completely interconnected and controlled channel characteristics. Compression tests were performed, and the data obtained agreed well with the typical behavior of a porous material undergoing deformation. Preliminary cell response to the as-fabricated scaffolds was studied with primary human fibroblasts. The results demonstrated the suitability of the process and the cell biocompatibility of the polymer, two important properties among the many required for effective clinical use and efficient tissue-engineering scaffolding.
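The lay-down patterns above are simply repeating sequences of deposition angles applied layer by layer. A minimal sketch of how such a sequence is generated (the `laydown_angles` helper is a hypothetical illustration, not the robot-control code used in the work):

```python
def laydown_angles(pattern, n_layers):
    """Deposition angle (degrees) for each layer of a scaffold built
    with a repeating lay-down pattern, e.g. [0, 60, 120]."""
    return [pattern[i % len(pattern)] for i in range(n_layers)]

# A 0/60/120 pattern repeats every three layers:
print(laydown_angles([0, 60, 120], 7))  # → [0, 60, 120, 0, 60, 120, 0]
```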

Relevance:

40.00%

Publisher:

Abstract:

Computer-aided technologies, medical imaging, and rapid prototyping have created new possibilities in biomedical engineering. The systematic variation of scaffold architecture, as well as the mineralization inside a scaffold/bone construct, can be studied using computer imaging technology, CAD/CAM and micro-computed tomography (micro-CT). In this paper, the potential of combining these technologies has been exploited in the study of scaffolds and osteochondral repair. Porosity, surface area per unit volume and the degree of interconnectivity were evaluated through imaging and computer-aided manipulation of the scaffold scan data. For the osteochondral model, the spatial distribution and the degree of bone regeneration were evaluated. The versatility of two software packages, Mimics (Materialise) and CTAn with 3D realistic visualization (SkyScan), was also assessed.
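Porosity and surface area per unit volume can be estimated directly from a segmented micro-CT volume. A rough sketch, assuming a boolean voxel array and a simple exposed-face count (dedicated packages such as CTAn use smoother surface estimators, e.g. marching cubes):

```python
import numpy as np

def porosity_and_ssa(solid, voxel=1.0):
    """Porosity and surface area per unit total volume of a binary
    micro-CT volume (True = solid), estimated by counting exposed
    voxel faces. A coarse voxel-based approximation."""
    solid = np.asarray(solid, dtype=bool)
    total_volume = solid.size * voxel ** 3
    porosity = float(1.0 - solid.sum() / solid.size)
    faces = 0
    for axis in range(3):
        a = np.swapaxes(solid, 0, axis)
        faces += np.count_nonzero(a[1:] != a[:-1])                  # internal solid/void faces
        faces += np.count_nonzero(a[0]) + np.count_nonzero(a[-1])   # faces on the volume boundary
    ssa = float(faces * voxel ** 2 / total_volume)
    return porosity, ssa

# A single solid voxel: porosity 0, six exposed unit faces.
print(porosity_and_ssa(np.ones((1, 1, 1), dtype=bool)))  # → (0.0, 6.0)
```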

Relevance:

40.00%

Publisher:

Abstract:

Rapid prototyping (RP) is a common name for several techniques that read in data from computer-aided design (CAD) drawings and automatically manufacture three-dimensional objects layer-by-layer according to the virtual design. The utilization of RP in tissue engineering enables the production of three-dimensional scaffolds with complex geometries and very fine structures. Adding micro- and nanometer details into the scaffolds improves the mechanical properties of the scaffold and ensures better cell adhesion to the scaffold surface. Thus, tissue engineering constructs can be customized according to data acquired from medical scans to match each patient's individual needs. In addition, RP enables control of the scaffold porosity, making it possible to fabricate applications with the desired structural integrity. Unfortunately, every RP process has its own unique disadvantages in building tissue engineering scaffolds. Hence, future research should focus on the development of RP machines designed specifically for the fabrication of tissue engineering scaffolds, although RP methods can already serve as a link between tissue and engineering.

Relevance:

40.00%

Publisher:

Abstract:

A novel method was developed for the quantitative assessment of pore interconnectivity using micro-CT data. This method makes use of simulated spherical particles percolating through the interconnected pore network. For each sphere diameter, the accessible pore volume is calculated. The algorithm was applied to compare the pore interconnectivity of two different scaffold architectures, one created by salt-leaching and the other by stereolithography, and revealed a much higher pore interconnectivity for the latter.
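The percolating-sphere idea can be sketched as a morphological opening restricted to boundary-connected clusters: erode the pore space by the sphere radius, keep only the eroded clusters that touch the sample boundary, then dilate back to obtain the swept volume. A simplified pure-Python illustration (the `accessible_pore_volume` helper and its voxel-based morphology are assumptions for illustration, not the published algorithm):

```python
import numpy as np
from collections import deque

def accessible_pore_volume(pore, diameter):
    """Fraction of the pore volume reachable by a simulated spherical
    particle of the given diameter (in voxels) entering from the sample
    boundary: erode, keep boundary-connected clusters, dilate back."""
    pore = np.asarray(pore, dtype=bool)
    nz, ny, nx = pore.shape
    r = diameter // 2
    offsets = [(dz, dy, dx)
               for dz in range(-r, r + 1)
               for dy in range(-r, r + 1)
               for dx in range(-r, r + 1)
               if dz * dz + dy * dy + dx * dx <= r * r]
    # 1. Erosion: centres where the whole sphere fits inside the pores.
    centres = np.zeros_like(pore)
    for z, y, x in zip(*np.nonzero(pore)):
        if all(0 <= z + dz < nz and 0 <= y + dy < ny and 0 <= x + dx < nx
               and pore[z + dz, y + dy, x + dx] for dz, dy, dx in offsets):
            centres[z, y, x] = True
    # 2. Flood fill (6-connectivity) from centre voxels on the boundary.
    seeds = [(z, y, x) for z, y, x in zip(*np.nonzero(centres))
             if 0 in (z, y, x) or z == nz - 1 or y == ny - 1 or x == nx - 1]
    reachable = np.zeros_like(centres)
    queue = deque(seeds)
    for v in seeds:
        reachable[v] = True
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            w = (z + dz, y + dy, x + dx)
            if (0 <= w[0] < nz and 0 <= w[1] < ny and 0 <= w[2] < nx
                    and centres[w] and not reachable[w]):
                reachable[w] = True
                queue.append(w)
    # 3. Dilation: the pore volume swept by spheres at reachable centres.
    swept = np.zeros_like(pore)
    for z, y, x in zip(*np.nonzero(reachable)):
        for dz, dy, dx in offsets:
            swept[z + dz, y + dy, x + dx] = True
    return float(swept.sum()) / max(int(pore.sum()), 1)

# A 1-voxel-wide straight channel is fully accessible to a point-like
# particle but inaccessible to a 3-voxel sphere.
channel = np.zeros((5, 5, 5), dtype=bool)
channel[:, 2, 2] = True
print(accessible_pore_volume(channel, 1), accessible_pore_volume(channel, 3))  # → 1.0 0.0
```

Repeating this over a range of diameters yields the accessible-volume curve described in the abstract.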


Relevance:

40.00%

Publisher:

Abstract:

The calibration process in micro-simulation is extremely complicated. The difficulties are more prevalent if the process encompasses fitting both aggregate and disaggregate parameters, e.g. travel time and headway. Current calibration practice operates mostly at the aggregate level, for example travel time comparison. Such practices are popular for assessing network performance. Though these applications are significant, there is another stream of micro-simulation calibration at the disaggregate level. This study focuses on such a micro-calibration exercise, which is key to better comprehending motorway traffic risk levels and the management of variable speed limit (VSL) and ramp metering (RM) techniques. A selected section of the Pacific Motorway in Brisbane is used as a case study. The discussion primarily covers the critical issues encountered during the parameter adjustment exercise (e.g. vehicular and driving behaviour parameters) with reference to key traffic performance indicators such as speed, lane distribution and headway at specific motorway points. The endeavour is to highlight the utility and implications of such disaggregate-level simulation for improved traffic prediction studies. The aspects of calibrating for points, in comparison to the whole network, are also briefly addressed to examine critical issues such as the suitability of local calibration at a global scale. The paper will be of interest to transport professionals in Australia/New Zealand, where micro-simulation, in particular at the point level, is still a comparatively unexplored territory in motorway management.
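At the aggregate level, one widely used acceptance check in motorway model calibration is the GEH statistic, which compares modelled and observed hourly flows; it is shown here only as an illustration of an aggregate goodness-of-fit indicator, not necessarily the indicator set used in this study:

```python
import math

def geh(modelled, observed):
    """GEH statistic for a modelled vs. observed hourly flow (veh/h);
    values below 5 are conventionally taken as an acceptable match."""
    return math.sqrt(2.0 * (modelled - observed) ** 2 / (modelled + observed))

# 1050 veh/h modelled against 1000 veh/h counted: well within GEH < 5.
print(round(geh(1050, 1000), 2))  # → 1.56
```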

Relevance:

40.00%

Publisher:

Abstract:

In this editorial letter, we provide the readers of Information Systems with a bird's-eye introduction to Process-Aware Information Systems (PAIS), a sub-field of Information Systems that has drawn growing attention in the past two decades, both as an engineering and as a management discipline. Against this backdrop, we briefly discuss how the papers included in this special issue contribute to extending the body of knowledge in this field.

Relevance:

40.00%

Publisher:

Abstract:

Sustainability has emerged as a primary context for engineering education in the 21st century, particularly in the sub-discipline of chemical engineering. However, there is confusion over how to integrate sustainability knowledge and skills systemically within bachelor degrees. This paper addresses this challenge, using a case study of an Australian chemical engineering degree to highlight important practical considerations for embedding sustainability at the core of the curriculum. The paper begins with context for considering a systematic process of rapid curriculum renewal. The authors then summarise a 2-year federally funded project, which piloted a model for rapid curriculum renewal led by the chemical engineering staff. Model elements contributing to the renewal of this engineering degree and described in this paper include: industry outreach; staff professional development; attribute identification and alignment; program mapping; and curriculum and teaching resource development. Personal reflections on the progress and process of rapid curriculum renewal in sustainability, by the authors and participating engineering staff, are presented as a means to discuss and identify methodological improvements, as well as to highlight barriers to project implementation. It is hoped that this paper will provide an example of a formalised methodology on which program reform and curriculum renewal for sustainability can be built in other higher education institutions.

Relevance:

40.00%

Publisher:

Abstract:

We conducted an in-situ X-ray micro-computed tomography heating experiment at the Advanced Photon Source (USA) to dehydrate an unconfined 2.3 mm diameter cylinder of Volterra gypsum. We used a purpose-built X-ray transparent furnace to heat the sample to 388 K for a total of 310 min and acquired a three-dimensional time-series tomography dataset comprising nine time steps. The voxel size of (2.2 μm)³ proved sufficient to pinpoint reaction initiation and the organization of the drainage architecture in space and time. We observed that dehydration commences across a narrow front, which propagates from the margins to the centre of the sample in more than four hours. The advance of this front can be fitted with a square-root function, implying that the initiation of the reaction in the sample can be described as a diffusion process. Novel parallelized computer codes allow the geometry of the porosity and the drainage architecture to be quantified from the very large tomographic datasets (2048³ voxels) in unprecedented detail. We determined the position, volume, shape and orientation of each resolvable pore and tracked these properties over the duration of the experiment. We found that the pore-size distribution follows a power law. Pores tend to be anisotropic but rarely crack-shaped, and have a preferred orientation, likely controlled by a pre-existing fabric in the sample. With ongoing dehydration, pores coalesce into a single interconnected pore cluster that is connected to the surface of the sample cylinder and provides an effective drainage pathway. Our observations can be summarized in a model in which gypsum is stabilized by thermal expansion stresses and locally increased pore fluid pressures until the dehydration front approaches to within about 100 μm. The internal stresses are then released and dehydration proceeds efficiently, producing new pore space. Pressure release, the production of pores and the advance of the front are coupled in a feedback loop.
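The square-root fit of the front advance amounts to a one-parameter least-squares estimate of x = k·sqrt(t). A small sketch (the `sqrt_rate_constant` helper is hypothetical, not the code used in the study):

```python
def sqrt_rate_constant(times, positions):
    """Least-squares constant k for a diffusion-style front law
    x = k * sqrt(t): minimising sum (x - k*sqrt(t))^2 over k gives
    k = sum(x * sqrt(t)) / sum(t)."""
    return sum(x * t ** 0.5 for t, x in zip(times, positions)) / sum(times)

# A synthetic front obeying x = 3*sqrt(t) exactly is recovered:
print(sqrt_rate_constant([1, 4, 9, 16], [3, 6, 9, 12]))  # → 3.0
```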

Relevance:

40.00%

Publisher:

Abstract:

Controlled drug delivery is a key topic in modern pharmacotherapy, where controlled drug delivery devices are required to prolong the period of release, maintain a constant release rate, or release the drug with a predetermined release profile. In the pharmaceutical industry, the development of a controlled drug delivery device can be facilitated enormously by the mathematical modelling of drug release mechanisms, directly decreasing the number of necessary experiments. Such modelling is difficult because several mechanisms are involved in the drug release process. The main drug release mechanisms of a controlled release device depend on the device's physicochemical properties, and include diffusion, swelling and erosion. In this thesis, four controlled drug delivery models are investigated. These four models selectively involve solvent penetration into the polymeric device, swelling of the polymer, polymer erosion and drug diffusion out of the device, but all share two common key features. The first is that the solvent penetration causes the polymer to change from a glassy state into a rubbery state. The interface between the two states of the polymer is modelled as a moving boundary whose speed is governed by a kinetic law. The second feature is that drug diffusion only happens in the rubbery region of the polymer, with a nonlinear diffusion coefficient that depends on the concentration of solvent. These models are analysed using both formal asymptotics and numerical computation, where front-fixing methods and the method of lines with finite difference approximations are used to solve the models numerically. This numerical scheme is conservative, accurate and easily applied to moving boundary problems, and is thoroughly explained in Section 3.2.
From the small-time asymptotic analysis in Sections 5.3.1, 6.3.1 and 7.2.1, these models exhibit the non-Fickian behaviour referred to as Case II diffusion, with an initial constant rate of drug release, which is appealing to the pharmaceutical industry because it indicates zero-order release. The numerical results of the models qualitatively confirm the experimental behaviour identified in the literature. The knowledge obtained from investigating these models can help to develop more complex multi-layered drug delivery devices in order to achieve sophisticated drug release profiles. A multi-layer matrix tablet, which consists of a number of polymer layers designed to provide sustained and constant drug release or bimodal drug release, is also discussed in this research. The moving boundary problem describing solvent penetration into the polymer also arises in melting and freezing problems, which have been modelled as the classical one-phase Stefan problem. The classical one-phase Stefan problem exhibits unrealistic singularities at the complete melting time. Hence we investigate the effect of including kinetic undercooling in the melting problem; this is called the one-phase Stefan problem with kinetic undercooling. Interestingly, we discover that the unrealistic singularities of the classical one-phase Stefan problem at the complete melting time are regularised, and the small-time asymptotic analysis in Section 3.3 shows that the small-time behaviour of the one-phase Stefan problem with kinetic undercooling differs from that of the classical problem. In the case of melting very small particles, it is known that surface tension effects are important. The effect of including surface tension in the melting problem for nanoparticles (without kinetic undercooling) has been investigated in the past; however, the one-phase Stefan problem with surface tension exhibits finite-time blow-up.
Therefore, we investigate the effect of including both surface tension and kinetic undercooling in the melting problem for nanoparticles, and find that the solution continues to exist until complete melting. Including kinetic undercooling and surface tension in the melting problems reveals more insight into the regularisation of unphysical singularities in the classical one-phase Stefan problem. This investigation gives a better understanding of melting a particle, and contributes to the current body of knowledge related to melting and freezing due to heat conduction.
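The front-fixing approach mentioned above maps the moving domain 0 < x < s(t) onto the fixed interval 0 ≤ ξ ≤ 1, at the cost of an extra advection term in the transformed PDE. A rough explicit sketch for the one-phase Stefan melting problem with an optional kinetic-undercooling parameter (the scheme, sign convention and parameter values are illustrative assumptions, not the thesis formulation):

```python
import numpy as np

def melt_front(s0=0.1, t_end=0.005, n=50, dt=1e-6, eps=0.0):
    """Explicit front-fixing scheme (xi = x/s(t)) for the one-phase Stefan
    melting problem: u_t = u_xx on 0 < x < s(t), u(0,t) = 1, with Stefan
    condition s'(t) = -u_x(s,t) and interface value u(s,t) = -eps*s'(t)
    (one sign convention for kinetic undercooling; eps = 0 recovers the
    classical problem). Returns the front position s(t_end)."""
    xi = np.linspace(0.0, 1.0, n + 1)
    dxi = xi[1] - xi[0]
    u = 1.0 - xi                      # linear initial profile on a melted seed
    s, t = s0, 0.0
    while t < t_end:
        # Solve the Stefan and undercooling conditions simultaneously:
        # s' = -(u[-1] - u[-2])/(dxi*s) with u[-1] = -eps*s'.
        sdot = u[-2] / (dxi * s - eps)        # requires eps < dxi*s
        lap = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dxi ** 2
        grad = (u[2:] - u[:-2]) / (2.0 * dxi)
        # Transformed PDE on the fixed grid: u_t = u_xixi/s^2 + xi*(s'/s)*u_xi
        u[1:-1] += dt * (lap / s ** 2 + xi[1:-1] * (sdot / s) * grad)
        s += dt * sdot
        u[0], u[-1] = 1.0, -eps * sdot
        t += dt
    return s

# The melt front advances from its seed position as heat diffuses in.
print(melt_front())
```

The explicit time step must satisfy the diffusion stability limit dt ≲ 0.5·(dξ·s)², which is why dt here is small relative to the grid spacing.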

Relevance:

40.00%

Publisher:

Abstract:

During the last several decades, natural resources and their services have been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, the development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to preserve the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of their populations for present and future generations, as well as to maintain the sustainability of their ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One important strategic approach for planning sustainable cities is 'ecological planning'. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p. 4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Therefore, ecological planning is a powerful method for creating sustainable urban ecosystems. In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy- and decision-makers improve their actions towards sustainable urban development.
There are several methods used in urban ecosystem sustainability assessment, among which sustainability indicators and composite indices are the most commonly used tools for assessing progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assess local- and micro-level sustainability, and no benchmark values exist for most of the indicators due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) underlines this by stating that "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison brings along biased assessments, as data only exist for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need to develop an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. In order to develop such a model, it is practical to adopt an approach that uses indicators to collect data, designates certain threshold values or ranges, performs a comparative sustainability assessment via indices at the micro-level, and aggregates these assessment findings to the local level. Through this approach and model, it is possible to produce sufficient and reliable data to enable comparison at the local level, and to provide useful results to inform local planning, conservation and development decision-making in order to secure sustainable ecosystems and urban futures.
To advance research in this area, this study investigated the environmental impacts of an existing urban context by using a composite index, with the aim of identifying the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, this study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the 'Micro-level Urban-ecosystem Sustainability IndeX' (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local- and micro-level sustainability reporting guidance to support policy-making on environmental issues. A multi-method research approach, based on both quantitative and qualitative analysis, was employed in the construction of the MUSIX model. First, qualitative research was conducted through an interpretive and critical literature review to develop the theoretical framework and select the indicators. Then, quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results assessed the sustainability performance of the current urban settings with reference to six main issues of urban development: (1) hydrology, (2) ecology, (3) pollution, (4) location, (5) design, and (6) efficiency. For each category, a set of core indicators was assigned which are intended to: (1) benchmark the current situation, strengths and weaknesses; (2) evaluate the efficiency of implemented plans; and (3) measure progress towards sustainable development.
While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best-practice development solutions. These strategies can be summarised as follows:
• Establishing hydrological conservation through sustainable stormwater management, in order to preserve the Earth's water cycle and aquatic ecosystems;
• Providing ecological conservation through sustainable ecosystem management, in order to protect biological diversity and maintain the integrity of natural ecosystems;
• Improving environmental quality through pollution prevention regulations and policies, in order to promote high-quality water resources, clean air and enhanced ecosystem health;
• Creating sustainable mobility and accessibility through better local services and walkable neighbourhoods, in order to promote safe environments and healthy communities;
• Designing the urban environment in a climate-responsive way, in order to increase the efficient use of solar energy and provide thermal comfort; and
• Using renewable resources and creating efficient communities, in order to provide long-term management of natural resources for the sustainability of future generations.
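Indicator-based indexing of the kind described above typically normalises each raw indicator against a benchmark range and then aggregates the normalised scores. A minimal sketch (the `composite_index` helper, its ranges and weights are illustrative assumptions, not the published MUSIX method):

```python
def composite_index(values, ranges, weights=None):
    """Min-max normalise raw indicator values against benchmark ranges
    and aggregate into a 0-100 composite score."""
    scores = []
    for v, (lo, hi) in zip(values, ranges):
        x = (v - lo) / (hi - lo)
        scores.append(min(max(x, 0.0), 1.0))      # clamp outside the benchmark range
    weights = weights if weights is not None else [1.0] * len(scores)
    return 100.0 * sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Two equally weighted indicators: one mid-range, one at its benchmark top.
print(composite_index([50, 10], [(0, 100), (0, 10)]))  # → 75.0
```

Clamping to the benchmark range means an indicator cannot dominate the composite score by exceeding its benchmark, which keeps parcel-scale scores comparable when aggregated to the neighbourhood scale.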

Relevance:

40.00%

Publisher:

Abstract:

INTRODUCTION It is known that vascular morphology and functionality change following closed soft tissue trauma (CSTT) [1] and bone fractures [2]. The disruption of blood vessels may lead to hypoxia and necrosis. Currently, most clinical methods for the diagnosis and monitoring of CSTT, with or without bone fractures, are based primarily on qualitative measures or practical experience, making diagnosis subjective and inaccurate. There is evidence that CSTT and early vascular changes following the injury delay soft tissue and bone healing [3]. However, a precise qualitative and quantitative morphological assessment of vasculature changes after trauma is currently missing. In this research, we aim to establish a diagnostic framework to assess the 3D vascular morphological changes after standardized CSTT in a rat model, qualitatively and quantitatively, using contrast-enhanced micro-CT imaging. METHODS An impact device was used to apply a controlled, reproducible CSTT to the left thigh (biceps femoris) of anaesthetized male Wistar rats. After the animals were euthanized at 6 hours, 24 hours, 3 days, 7 days or 14 days after trauma, CSTT was qualitatively evaluated by macroscopic visual observation of the skin and muscles. For visualization of the vasculature, the blood vessels of the sacrificed rats were flushed with heparinised saline and then perfused with a radio-opaque contrast agent (Microfil, MV 122, Flowtech, USA) using an infusion pump. After the contrast agent was allowed to polymerize overnight, both hind-limbs were dissected, and the whole injured and contralateral control limbs were imaged using a micro-CT scanner (µCT 40, Scanco Medical, Switzerland) to evaluate the vascular morphological changes. Correlated biopsy samples were also taken from the CSTT region of both injured and control legs.
Morphological parameters such as the vessel volume ratio (VV/TV), vessel diameter (V.D), spacing (V.Sp), number (V.N), connectivity (V.Conn) and degree of anisotropy (DA) were then quantified by evaluating scans of the biopsy samples with the micro-CT imaging system. RESULTS AND DISCUSSION The qualitative evaluation of the CSTT showed that the developed impact protocols were capable of producing a defined and reproducible injury within the region of interest (ROI), resulting in a large hematoma and moderate swelling on both the lateral and medial sides of the injured legs. The visualization of the vascular network using 3D images also confirmed the ability to consistently perfuse the large vessels and a majority of the microvasculature (Figure 1). Quantification of the vascular morphology obtained from correlated biopsy samples demonstrated that V.D, V.N and V.Sp were significantly higher in the injured legs 24 hours after impact in comparison with the control legs (p<0.05). The evaluation of the other time points is currently in progress. CONCLUSIONS The findings of this research will contribute to a better understanding of the changes to the vascular network architecture following traumatic injuries and during the healing process. When interpreted in the context of functional changes, such as tissue oxygenation, this will allow for objective diagnosis and monitoring of CSTT and serve as validation for future non-invasive clinical assessment modalities.

Relevance:

40.00%

Publisher:

Abstract:

This article describes the first steps toward a comprehensive characterization of molecular transport within scaffolds for tissue engineering. The scaffolds were fabricated using a novel melt electrospinning technique capable of constructing 3D lattices of layered polymer fibers with well-defined internal microarchitectures. The general morphology and structural order were then determined using T2-weighted magnetic resonance imaging and X-ray micro-computed tomography. Diffusion tensor microimaging was used to measure the time-dependent diffusivity and diffusion anisotropy within the scaffolds. The measured diffusion tensors were anisotropic and consistent with the cross-hatched geometry of the scaffolds: diffusion was least restricted in the direction perpendicular to the fiber layers. The results demonstrate that the cross-hatched scaffold structure preferentially promotes molecular transport vertically through the layers (z-axis), with more restricted diffusion in the directions of the fiber layers (x–y plane). Diffusivity in the x–y plane was observed to be invariant to the fiber thickness. The characteristic pore size of the fiber scaffolds can be probed by sampling the diffusion tensor at multiple diffusion times. The prospective application of diffusion tensor imaging for real-time monitoring of tissue maturation and nutrient transport pathways within tissue engineering scaffolds is discussed.
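Diffusion anisotropy of the kind measured here is often summarised by a scalar index computed from the eigenvalues of the diffusion tensor. A short sketch using fractional anisotropy, one common choice (not necessarily the measure used in the article):

```python
import numpy as np

def fractional_anisotropy(D):
    """Fractional anisotropy of a symmetric 3x3 diffusion tensor:
    0 for isotropic diffusion, approaching 1 as diffusion becomes
    confined to a single direction."""
    lam = np.linalg.eigvalsh(np.asarray(D, dtype=float))
    md = lam.mean()                                  # mean diffusivity
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    return float(np.sqrt(1.5) * num / den) if den else 0.0

# Isotropic tensor → FA = 0; diffusion twice as fast along z (through
# the layers) than in the x–y plane → FA > 0.
print(round(fractional_anisotropy(np.diag([1.0, 1.0, 1.0])), 3))  # → 0.0
print(round(fractional_anisotropy(np.diag([1.0, 1.0, 2.0])), 3))  # → 0.408
```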