922 results for "Traditional enrichment method"


Relevance: 20.00%

Publisher:

Abstract:

We have developed a method to test the cytotoxicity of wound dressings, ointments, creams and gels used in our Burn Centre by placing them on a permeable Nunc polycarbonate cell culture insert incubated over a monolayer of cells (HaCaT cells and primary human keratinocytes). METHODS: We used two different methods to determine the relative toxicity to cells. (1) Photo visualisation: the dressings or compounds were positioned on the insert's membrane, which was placed onto the tissue culture plate containing the monolayer. After 24 h the surviving adherent cells were stained with Toluidine Blue and the plates were photographed. The acellular area, where non-adherent dead cells had been washed off with buffer, was measured as a percentage of the total area of the plate. (2) Cell count of surviving cells: after 24 h of incubation with the test material, the remaining cells were detached with trypsin, spun down and counted in a haemocytometer with Trypan Blue, which differentiates between live and dead cells. RESULTS: Seventeen products were tested. The least cytotoxic products were Melolite, White Soft Paraffin and Chlorsig 1% ointment. Some cytotoxicity was shown with Jelonet, Mepitel®, PolyMem®, DuoDerm® and Xeroform. The most cytotoxic products included those containing silver or chlorhexidine, and Paraffin Cream, a moisturiser containing the preservative chlorocresol. CONCLUSION: This in vitro cell culture insert method allows testing of agents without direct cell contact. It is easy and quick to perform, and should help the clinician determine the relative cytotoxicity of various dressings and the optimal dressing for each individual wound.
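The acellular-area calculation in method (1) amounts to thresholding a photo of the stained plate and reporting the unstained fraction. A minimal sketch in Python; the function name, threshold value and toy image are illustrative, not taken from the study:

```python
import numpy as np

def acellular_area_percent(stained: np.ndarray, threshold: float) -> float:
    """Percentage of the plate area left acellular (unstained).

    `stained` is a 2-D greyscale image of the Toluidine Blue-stained
    plate; pixels darker than `threshold` are taken to be stained
    (surviving adherent) cells.
    """
    cell_mask = stained < threshold          # dark pixels = stained cells
    acellular = stained.size - cell_mask.sum()
    return 100.0 * acellular / stained.size

# Toy 4x4 "photo": 0 = dark stained cell, 255 = bare plate
img = np.array([[0, 0, 255, 255],
                [0, 0, 255, 255],
                [255, 255, 255, 255],
                [255, 255, 255, 255]], dtype=float)
print(acellular_area_percent(img, threshold=128))  # 75.0
```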


It is widely acknowledged that effective asset management requires an interdisciplinary approach, in which synergies should exist between traditional disciplines such as accounting, engineering, finance, humanities, logistics, and information systems technologies. Asset management is an important yet complex business practice. Business process modelling is proposed as an approach to manage this complexity through the modelling of asset management processes. A sound foundation for the systematic application and analysis of business process modelling in asset management is, however, yet to be developed. Fundamentally, a business process consists of activities (termed functions), events/states, and control flow logic. As both events/states and control flow logic are somewhat dependent on the functions themselves, it is logical to first identify the functions within a process. This research addresses the current gap in knowledge by developing a method to identify functions common to various industry types (termed core functions). This lays the foundation for extracting such functions, so as to identify both commonalities and variation points in asset management processes. The method combines manual text mining with a taxonomy approach. An example is presented.
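As a rough illustration of the core-function idea, function labels extracted from process descriptions in several industries can be normalised and intersected; the industry names and function labels below are hypothetical, not the paper's data:

```python
# Hypothetical per-industry function lists from manual text mining
processes = {
    "utilities": ["inspect asset", "schedule maintenance", "replace asset"],
    "rail":      ["Inspect Asset", "schedule maintenance", "record failure"],
    "mining":    ["inspect asset", "Schedule Maintenance", "procure asset"],
}

def core_functions(processes):
    """Functions (normalised to lower case) present in every industry."""
    sets = [set(f.lower() for f in funcs) for funcs in processes.values()]
    return set.intersection(*sets)

print(sorted(core_functions(processes)))
# ['inspect asset', 'schedule maintenance']
```

A taxonomy would additionally map near-synonyms ("inspect asset" vs. "examine asset") onto one label before intersecting.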


Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to investigate. Its recent gain in popularity can be attributed in part to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as it uses explicit data at a fine level of detail, and computer-intensive, as it requires many interactions between agents, which can learn and pursue goals. With the growing availability of data and the increase in computing power, these concerns are, however, fading. Nonetheless, updating or extending the model as more information becomes available can become problematic because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is the electricity distribution network, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that it has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one aspect or the other for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and to what aim. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on which simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels) or to describe the assets and their relation to one another (e.g. the network assets). Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, either sequentially or in parallel for faster runs. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets.
Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport; its application to electric vehicles is part of future work.
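The separation of assets (physical characteristics) from agents (behaviour) described above can be sketched as follows. This is a loose Python analogue for illustration only, not MODAM's actual API; the class names, fields and goal logic are invented:

```python
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    # Physical characteristics only (information hiding: no behaviour here)
    capacity_kwh: float
    depth_of_discharge: float

class Agent:
    """Behaviour is kept separate from the asset it controls."""
    def __init__(self, asset, goal_price):
        self.asset, self.goal_price = asset, goal_price
    def step(self, price):
        # Toy goal: charge when electricity is cheap, discharge otherwise
        return "charge" if price < self.goal_price else "discharge"

battery = BatteryAsset(capacity_kwh=13.5, depth_of_discharge=0.9)
# The same physical asset description serves two different behaviours
peak_shaver = Agent(battery, goal_price=0.20)
arbitrageur = Agent(battery, goal_price=0.10)
print(peak_shaver.step(0.15), arbitrageur.step(0.15))  # charge discharge
```

Swapping in a new behaviour then means writing a new `Agent`, not touching the asset description, which is the composability the abstract argues for.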


Olfactory ensheathing cells (OECs) play an important role in the continuous regeneration of the primary olfactory nervous system throughout life and in the regeneration of olfactory neurons after injury. While it is known that several OEC subpopulations with distinct properties exist in different anatomical locations, it remains unclear how these subpopulations respond to a major injury. We examined the proliferation of OECs from one distinct location, the peripheral accessory olfactory nervous system, following large-scale injury (bulbectomy) in mice. We used crosses of two transgenic reporter mouse lines, S100β-DsRed and OMP-ZsGreen, to visualise OECs and main/accessory olfactory neurons, respectively. We surgically removed one olfactory bulb, including the accessory olfactory bulb, to induce degeneration, and found that accessory OECs in the nerve bundles that terminate in the accessory olfactory bulb responded with increased proliferation, peaking 2 days after the injury. To label proliferating cells we used the thymidine analogue ethynyl deoxyuridine (EdU), delivered intranasally instead of by intraperitoneal injection. We compared and quantified the number of proliferating cells in different regions at one and four days after EdU labelling by the two delivery methods and found that intranasal delivery was as effective as intraperitoneal injection. We demonstrated that accessory OECs actively respond to widespread degeneration of accessory olfactory axons by proliferating. These results have important implications for selecting the source of OECs for neural regeneration therapies, and show that intranasal delivery of EdU is an efficient and reliable method for assessing proliferation of olfactory glia.


This study extends the 'zero scan' method for CT imaging of polymer gel dosimeters to include multi-slice acquisitions. Multi-slice CT images consisting of 24 slices of 1.2 mm thickness were acquired of an irradiated polymer gel dosimeter and processed with the zero-scan technique. The results demonstrate that zero-scan gel readout can be successfully applied to generate a three-dimensional image of the irradiated gel field. Compared to the raw CT images, the processed images and cross-gel profiles showed reduced noise and clear visibility of the penumbral region. Moreover, these results further highlight the suitability of the method for volumetric reconstruction with reduced CT data acquisition per slice. This work shows that 3D volumes of irradiated polymer gel dosimeters can be acquired and processed with x-ray CT.
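Read per voxel, the zero-scan idea is a linear fit of CT number against scan index, extrapolated back to a hypothetical zeroth scan to suppress stochastic noise. A minimal numpy sketch under that reading; the synthetic stack and drift values are invented, not the study's acquisition parameters:

```python
import numpy as np

def zero_scan(stack):
    """Per-pixel linear fit of CT number vs scan index, extrapolated
    to a hypothetical zeroth scan (the intercept image).
    `stack` has shape (n_scans, rows, cols)."""
    n, rows, cols = stack.shape
    x = np.arange(1, n + 1)
    flat = stack.reshape(n, -1)               # one column per pixel
    slope, intercept = np.polyfit(x, flat, 1) # vectorised over pixels
    return intercept.reshape(rows, cols)

# Synthetic stack: true image value 10 HU, drift of +0.5 HU per scan
truth = np.full((2, 2), 10.0)
stack = np.stack([truth + 0.5 * k for k in range(1, 6)])
print(zero_scan(stack))  # recovers ~10 everywhere
```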


Quantitative analysis is increasingly being used in team sports to better understand performance in these stylized, delineated, complex social systems. Here we provide a first step toward understanding the pattern-forming dynamics that emerge from collective offensive and defensive behavior in team sports. We propose a novel method of analysis that captures how teams occupy sub-areas of the field as the ball changes location. We used the method to analyze a game of association football (soccer), based upon the hypothesis that local player numerical dominance is key to defensive stability and offensive opportunity. We found that the teams consistently allocated more players than their opponents in sub-areas of play closer to their own goal. This is consistent with a predominantly defensive strategy intended to avoid yielding even a single goal. We also found differences between the two teams' strategies: while both adopted the same distribution of defensive, midfield, and attacking players (a 4:3:3 system of play), one team was significantly more effective at maintaining both defensive and offensive numerical dominance. That team indeed won the match by a single goal (2 to 1), but the analysis shows its advantage in play was more pervasive than the one-goal margin would indicate. Our focus on the local dynamics of team collective behavior is distinct from the traditional focus on individual player capability. It supports a broader view in which specific player abilities contribute within the context of multiplayer team coordination and coaching strategy. By applying this complex-systems analysis to association football, we can understand how players' and teams' strategies result in successful and unsuccessful relationships between teammates and opponents in the area of play.
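The local numerical-dominance measure can be sketched as a simple count difference within a sub-area of the pitch; the player coordinates and the sub-area predicate below are made up for illustration:

```python
def dominance(team_a, team_b, in_cell):
    """Numerical dominance of team A over team B in one sub-area:
    (players of A in the cell) - (players of B in the cell)."""
    count = lambda players: sum(1 for p in players if in_cell(p))
    return count(team_a) - count(team_b)

# Sub-area: defensive third of a 105 m pitch (x coordinate < 35 m)
left_third = lambda pos: pos[0] < 35.0

team_a = [(10, 30), (20, 50), (40, 40), (60, 20)]  # (x, y) positions, m
team_b = [(15, 35), (70, 40), (80, 10), (90, 60)]
print(dominance(team_a, team_b, left_third))  # 2 - 1 = 1
```

Tracking this quantity per sub-area as the ball moves gives the field-occupation pattern the abstract describes.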


Nitrogen-doped TiO2 nanofibres of anatase and TiO2(B) phases were synthesised by a reaction between titanate nanofibres of a layered structure and gaseous NH3 at 400–700 °C, following a different mechanism from that of direct nitrogen doping of TiO2. The surface of the N-doped TiO2 nanofibres can be tuned by facile calcination in air to remove the surface-bonded N species, while the core remains N-doped. N-doped TiO2 nanofibres became effective photocatalysts for the decomposition of sulforhodamine B under visible-light irradiation only after calcination in air. The oxidised surface layer proved very effective for adsorption of organic molecules and activation of oxygen molecules, whereas the remaining N-doped interior of the fibres strongly absorbed visible light, resulting in the generation of electrons and holes. The N-doped nanofibres were also used as supports for gold nanoparticle (Au NP) photocatalysts for visible-light-driven hydroamination of phenylacetylene with aniline. Phenylacetylene was activated on the N-doped surface of the nanofibres and aniline on the Au NPs. Au NPs adsorbed on N-doped TiO2(B) nanofibres exhibited much better conversion (80% of phenylacetylene) than when adsorbed on undoped fibres (46%) at 40 °C, with 95% of the product being the desired imine. The surface N species can prevent the adsorption of O2, which is unfavourable for the hydroamination reaction, and thus improve the photocatalytic activity; removal of the surface N species resulted in a sharp decrease of the photocatalytic activity. These photocatalysts are feasible for practical applications because, owing to their fibril morphology, they can be easily dispersed into solution and separated from a liquid by filtration, sedimentation or centrifugation.


Largely as a result of mass-unemployment problems in many European countries, the dynamics of job creation has in recent years attracted increased interest on the part of academics as well as policy-makers. In connection with this, a large number of studies carried out in various countries have concluded that SMEs play a very large and/or growing role as job creators (Birch, 1979; Baldwin and Picot, 1995; Davidsson, 1995a; Davidsson, Lindmark and Olofsson, 1993; 1994; 1995; 1997a; 1997b; Fumagelli and Mussati, 1993; Kirchhoff and Phillips, 1988; Spilling, 1995; for further reference to studies carried out in a large number of countries see also Aiginger and Tichy, 1991; ENSR, 1994; Loveman and Sengenberger, 1991; OECD, 1987; Storey and Johnson, 1987). While most researchers agree on the importance of SMEs, there is some controversy as to whether this is mainly the result of many small start-ups and incremental expansions, or whether a small minority of high-growth SMEs contributes the lion's share of new employment. This is known as the 'mice vs. gazelles' or 'flyers vs. trundlers' debate. Storey strongly advocates the position that the small group of high-growth SMEs are the 'real' job creators (Storey, 1994; Storey & Johnson, 1987), whereas, e.g., the Davidsson et al. research in Sweden (cf. above) gives more support to the 'mice' hypothesis.


The numerical solution in one space dimension of advection–reaction–diffusion systems with nonlinear source terms may incur a high computational cost when presently available methods are used. Numerous examples of finite volume schemes with high-order spatial discretisations, together with various techniques for approximating the advection term, can be found in the literature. Almost all such techniques result in a nonlinear system of equations as a consequence of the finite volume discretisation, especially when there are nonlinear source terms in the associated partial differential equation models. This work introduces a new technique that avoids generating such nonlinear systems of equations during the spatial discretisation process, provided the nonlinear source terms in the model equations can be expanded in positive powers of the dependent variable of interest. The basis of the method is a new linearisation technique for the temporal integration of the nonlinear source terms, supplementing an otherwise standard finite volume method. The resulting linear system of equations is shown to be both accurate and significantly faster to solve than methods that require nonlinear-system solvers.
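To illustrate the kind of linearisation involved (this is a generic sketch, not the authors' exact scheme; the grid sizes and parameters are arbitrary), consider u_t = D u_xx + u²: evaluating one factor of the quadratic source at the previous time level turns each implicit step into a linear solve.

```python
import numpy as np

# Solve u_t = D u_xx + u**2 on a uniform 1-D grid, zero-flux boundaries.
# The nonlinear source u**2 is linearised about the previous time level
# as u_old * u_new, so each step solves a *linear* system.
def step(u, D, dx, dt):
    n = len(u)
    A = np.zeros((n, n))
    r = D * dt / dx**2
    for i in range(n):
        A[i, i] = 1 + 2 * r - dt * u[i]   # linearised source on diagonal
        A[i, max(i - 1, 0)] -= r          # index clamping gives zero-flux
        A[i, min(i + 1, n - 1)] -= r      # boundary conditions
    return np.linalg.solve(A, u)

u = np.full(5, 0.1)
for _ in range(10):
    u = step(u, D=0.01, dx=0.25, dt=0.01)
print(u)  # the uniform state grows, following the ODE u' = u**2
```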


The acceptance of broadband ultrasound attenuation for the assessment of osteoporosis suffers from a limited understanding of ultrasound wave propagation through cancellous bone. It has recently been proposed that ultrasound wave propagation can be described by a concept of parallel sonic rays. This concept approximates the detected transmission signal as the superposition of all sonic rays that travel directly from the transmitting to the receiving transducer. The transit time of each ray is defined by the proportions of bone and marrow through which it propagates. An ultrasound transit time spectrum describes the proportion of sonic rays having a particular transit time, effectively describing the lateral inhomogeneity of transit times over the surface of the receiving ultrasound transducer. The aim of this study was to provide a proof of concept that a transit time spectrum may be derived from digital deconvolution of the input and output ultrasound signals. We applied an active-set deconvolution algorithm to determine the ultrasound transit time spectra in the three orthogonal directions of four cancellous bone replica samples, and compared the experimental data with predictions from computer simulation. The agreement between experimental and predicted ultrasound transit time spectra, derived from Bland–Altman analysis, ranged from 92% to 99%, supporting the concept of parallel sonic rays for ultrasound propagation in cancellous bone. In addition to further validating the parallel sonic ray concept, this technique offers the opportunity for quantitative characterisation of the material and structural properties of cancellous bone not previously available using ultrasound.
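The deconvolution step can be illustrated by writing the received signal as the transmitted pulse convolved with a transit-time spectrum and inverting the resulting linear system. The sketch below uses unconstrained numpy least squares in place of the authors' non-negativity-constrained active-set solver, and the pulse and spectrum are made-up toy signals:

```python
import numpy as np

def transit_time_spectrum(input_sig, output_sig):
    """Recover the spectrum h from output = input * h (discrete
    convolution) by forming the convolution matrix of the input
    and solving the least-squares problem."""
    n = len(output_sig) - len(input_sig) + 1
    A = np.column_stack([np.roll(np.pad(input_sig, (0, n - 1)), k)
                         for k in range(n)])
    h, *_ = np.linalg.lstsq(A, output_sig, rcond=None)
    return h

pulse = np.array([1.0, 2.0, 1.0])        # transmitted pulse
h_true = np.array([0.0, 1.0, 0.5, 0.0])  # two transit-time paths
received = np.convolve(pulse, h_true)    # detected superposition
print(np.round(transit_time_spectrum(pulse, received), 6))
# ~ [0, 1, 0.5, 0]
```

With noisy signals the unconstrained solution can go negative, which is why a non-negativity-constrained (active-set) solver is the appropriate tool for real spectra.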


Many interacting factors contribute to a student's choice of a university. This study takes a systems perspective on this choice and develops a Bayesian network to represent and quantify these factors and their interactions. The systems model is illustrated through a small study of traditional school leavers in Australia, and highlights similarities and differences between universities' perceptions of student choices, students' perceptions of the factors they should consider, and how students actually make choices. The study shows the range of information that can be gained from this approach, including identification of important factors and scenario assessment.
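A Bayesian network over such factors can be illustrated with a toy two-parent network; the structure and all probabilities below are invented for illustration and are not results from the study:

```python
# Toy network: Reputation -> Enrol <- Distance (illustrative CPTs)
p_rep  = {"high": 0.4, "low": 0.6}
p_dist = {"near": 0.5, "far": 0.5}
p_enrol = {  # P(enrol = yes | reputation, distance)
    ("high", "near"): 0.9, ("high", "far"): 0.6,
    ("low",  "near"): 0.5, ("low",  "far"): 0.2,
}

def p_enrol_yes():
    """Marginal P(enrol = yes) by enumerating the parents."""
    return sum(p_rep[r] * p_dist[d] * p_enrol[(r, d)]
               for r in p_rep for d in p_dist)

def p_rep_high_given_enrol():
    """Diagnostic query P(reputation = high | enrol = yes), Bayes' rule."""
    joint = sum(p_rep["high"] * p_dist[d] * p_enrol[("high", d)]
                for d in p_dist)
    return joint / p_enrol_yes()

print(round(p_enrol_yes(), 3))             # 0.51
print(round(p_rep_high_given_enrol(), 3))  # 0.588
```

Scenario assessment then amounts to fixing some nodes as evidence and re-running the queries.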


Exact solutions of partial differential equation models describing the transport and decay of single-species and coupled multispecies problems can provide insight into the fate and transport of solutes in saturated aquifers. Most previous analytical solutions are based on integral transform techniques, which restrict the initial condition: its choice has an important impact on whether or not the inverse transform can be calculated exactly. In this work we describe and implement a technique that produces exact solutions for single- and multispecies reactive transport problems with more general, smooth initial conditions. We achieve this by using a different method to invert the Laplace transform, which produces a power series solution. To demonstrate the utility of the technique, we apply it to two example problems with initial conditions that cannot be handled exactly using traditional transform techniques.
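A power-series-in-time solution amounts to the operator-exponential series c(t) = Σₙ tⁿ/n! Lⁿc₀ for a linear spatial operator L. A numerical sketch of truncating that series, with a simple first-order decay operator standing in for a full reactive-transport operator:

```python
import numpy as np

def series_solution(c0, L_apply, t, n_terms=20):
    """Approximate c(t) = exp(t L) c0 by the truncated power series
    sum_n t**n / n! * L^n c0, for a smooth initial condition c0."""
    term, total = c0.copy(), c0.copy()
    for n in range(1, n_terms):
        term = t / n * L_apply(term)   # builds t**n/n! * L^n c0 recursively
        total = total + term
    return total

decay = lambda c: -0.5 * c             # L c = -lambda * c, lambda = 0.5
c0 = np.array([1.0, 2.0, 4.0])         # smooth initial condition (sampled)
print(series_solution(c0, decay, t=2.0))
# ~ c0 * exp(-1) = [0.368, 0.736, 1.472]
```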


We present a finite volume method to solve the time-space two-sided fractional advection-dispersion equation on a one-dimensional domain. The spatial discretisation employs fractionally-shifted Grünwald formulas to discretise the Riemann-Liouville fractional derivatives at control volume faces in terms of function values at the nodes. We demonstrate how the finite volume formulation provides a natural, convenient and accurate means of discretising this equation in conservative form, compared to using a conventional finite difference approach. Results of numerical experiments are presented to demonstrate the effectiveness of the approach.
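The Grünwald weights underlying such formulas can be generated by a simple recurrence. A sketch of the standard (unshifted) weights g_k = (-1)^k C(α, k); the shifted variants used at control volume faces reindex these same weights:

```python
import numpy as np

def grunwald_weights(alpha, n):
    """First n Grünwald weights g_k = (-1)**k * C(alpha, k),
    via the stable recurrence g_k = g_{k-1} * (k - 1 - alpha) / k."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

g = grunwald_weights(alpha=1.8, n=6)
print(np.round(g, 4))
# For 1 < alpha < 2: g_0 = 1, g_1 = -alpha, then small positive weights
# whose total sum tends to (1 - 1)**alpha = 0.
```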


Detecting anomalies in online social networks is a significant task, as it helps reveal useful and interesting information about user behaviour on the network. This paper proposes a rule-based hybrid method using graph theory, fuzzy clustering and fuzzy rules for modelling the user relationships inherent in online social networks and for identifying anomalies. Fuzzy C-means clustering is used to cluster the data, and a fuzzy inference engine is used to generate rules based on the cluster behaviour. The proposed method achieves improved accuracy in identifying anomalies compared with existing methods.
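A minimal fuzzy C-means implementation shows the clustering step; the toy data, fuzzifier value and the anomaly rule mentioned in the comments are illustrative choices, not the paper's settings:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy C-means: returns membership matrix U (n x c)
    and cluster centres. m > 1 is the fuzzifier."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))         # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centres

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
U, centres = fuzzy_c_means(X)
# A downstream rule such as "max membership < 0.6" could then flag
# points with diffuse memberships as candidate anomalies.
print(np.round(U, 2))
```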


In this paper, we consider the problem of document ranking in a non-traditional retrieval task called subtopic retrieval. This task involves promoting, at early ranks, relevant documents that cover many subtopics of a query, thus providing diversity within the ranking. In recent years, several approaches have been proposed to diversify retrieval results. These approaches can be classified into two main paradigms, depending upon how document ranks are revised to promote diversity. In the first approach, subtopic diversification is achieved implicitly, by choosing documents that are different from each other; in the second, it is achieved explicitly, by estimating the subtopics covered by documents. Within this context, we compare methods belonging to the two paradigms. Furthermore, we investigate possible strategies for integrating the two paradigms with the aim of formulating a new ranking method for subtopic retrieval. We conduct a number of experiments to empirically validate and contrast the state-of-the-art approaches as well as instantiations of our integration approach. The results show that the integration approach outperforms state-of-the-art strategies with respect to a number of measures.
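The implicit paradigm is typified by Maximal Marginal Relevance-style greedy re-ranking, which trades relevance against similarity to documents already chosen; the scores and similarities below are made up for illustration:

```python
def mmr_rerank(query_scores, sim, k, lam=0.5):
    """Implicit diversification (MMR): greedily pick the document that
    balances relevance to the query against similarity to already
    selected documents. `query_scores[d]` is relevance; `sim[a][b]`
    is inter-document similarity."""
    selected = []
    candidates = set(query_scores)
    while candidates and len(selected) < k:
        best = max(candidates,
                   key=lambda d: lam * query_scores[d] - (1 - lam) *
                   max((sim[d][s] for s in selected), default=0.0))
        selected.append(best)
        candidates.remove(best)
    return selected

scores = {"d1": 0.9, "d2": 0.85, "d3": 0.5}
sim = {"d1": {"d2": 0.95, "d3": 0.1},
       "d2": {"d1": 0.95, "d3": 0.1},
       "d3": {"d1": 0.1,  "d2": 0.1}}
print(mmr_rerank(scores, sim, k=2))  # ['d1', 'd3']: d2 near-duplicates d1
```

An explicit method would instead score documents against estimated subtopics; the integration the paper studies combines both signals in one objective.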