373 results for ACQUIRED TOXOPLASMOSIS


Abstract:

Road surface macro-texture is an indicator used to determine the skid resistance levels in pavements. Existing methods of quantifying macro-texture include the sand patch test and the laser profilometer. These methods utilise the 3D information of the pavement surface to extract the average texture depth. Recently, interest in image processing techniques as a quantifier of macro-texture has arisen, mainly using the Fast Fourier Transform (FFT). This paper reviews the FFT method, and then proposes two new methods, one using the autocorrelation function and the other using wavelets. The methods are tested on pictures obtained from a pavement surface extending more than 2 km. About 200 images were acquired from the surface at approximately 10 m intervals, from a height of 80 cm above the ground. The results obtained from image analysis methods using the FFT, the autocorrelation function and wavelets are compared with sensor measured texture depth (SMTD) data obtained from the same paved surface. The results indicate that coefficients of determination (R²) exceeding 0.8 are obtained when up to 10% of outliers are removed.
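The abstract does not give the exact estimators used, so the sketch below is only a rough illustration of the FFT and autocorrelation routes it describes: two simple texture indicators computed from a single 1D intensity or height profile. The macro-texture band limits, the sample spacing and the 0.5 correlation threshold are assumptions, not values from the paper.

```python
import numpy as np

def texture_metrics(profile, sample_spacing_mm=1.0):
    """Estimate simple macro-texture indicators from a 1D surface profile.

    profile           : 1D array of intensity or height samples along the pavement
    sample_spacing_mm : distance between samples (assumed value)
    """
    x = np.asarray(profile, dtype=float)
    x = x - x.mean()                                        # remove the DC component

    # FFT route: share of spectral energy in the macro-texture band (0.5-50 mm wavelengths)
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=sample_spacing_mm)    # cycles per mm
    band = (freqs > 1 / 50.0) & (freqs < 1 / 0.5)
    fft_energy_fraction = spectrum[band].sum() / spectrum.sum()

    # Autocorrelation route: lag at which the correlation first drops below 0.5
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    acf = acf / acf[0]
    below = np.where(acf < 0.5)[0]
    corr_length_mm = below[0] * sample_spacing_mm if below.size else np.nan

    return fft_energy_fraction, corr_length_mm
```

In a study like the one described, indicators of this kind would be computed per image and regressed against the SMTD values to obtain the reported R² figures.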

Abstract:

Background: The enthesis of the plantar fascia is thought to play an important role in stress dissipation. However, the potential link between the entheseal thickening characteristic of enthesopathy and the stress-dissipating properties of the intervening plantar fat pad has not been investigated. Purpose: This study was conducted to identify whether plantar fat pad mechanics explain variance in the thickness of the fascial enthesis in individuals with and without plantar enthesopathy. Study Design: Case-control study; Level of evidence, 3. Methods: The study population consisted of 9 patients with unilateral plantar enthesopathy and 9 asymptomatic, individually matched controls. The thickness of the enthesis of the symptomatic, asymptomatic, and matched control limbs was acquired using high-resolution ultrasound. The compressive strain of the plantar fat pad during walking was estimated from dynamic lateral radiographs acquired with a multifunction fluoroscopy unit. Peak compressive stress was simultaneously acquired via a pressure platform. Principal viscoelastic parameters were estimated from the resulting stress-strain curves. Results: The symptomatic fascial enthesis (6.7 ± 2.0 mm) was significantly thicker than the asymptomatic enthesis (4.2 ± 0.4 mm), which in turn was thicker than the enthesis (3.3 ± 0.4 mm) of control limbs (P < .05). There was no significant difference in the mean thickness, peak stress, peak strain, or secant modulus of the plantar fat pad between limbs. However, the energy dissipated by the fat pad during loading and unloading was significantly lower in the symptomatic limb (0.55 ± 0.17) than in the asymptomatic (0.69 ± 0.13) and control (0.70 ± 0.09) limbs (P < .05). The sonographic thickness of the enthesis was correlated with the energy dissipation ratio of the plantar fat pad (r = .72, P < .05), but only in the symptomatic limb. Conclusion: The energy-dissipating properties of the plantar fat pad are associated with the sonographic appearance of the enthesis in symptomatic limbs, providing a previously unidentified link between the mechanical behavior of the plantar fat pad and enthesopathy.
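The abstract reports the energy dissipated during loading and unloading as a dimensionless ratio but does not define it; a common convention is the hysteresis fraction of the stress-strain cycle. The sketch below follows that convention and is a plausible reconstruction only, not the authors' actual computation.

```python
import numpy as np

def energy_dissipation_ratio(strain, stress):
    """Hysteresis-based energy dissipation ratio of one loading/unloading cycle.

    strain, stress : arrays sampled over a full load-unload cycle, ordered in
                     time (loading first, then unloading)
    Returns the fraction of the loading energy not recovered on unloading
    (0 = perfectly elastic, 1 = fully dissipative).
    """
    strain = np.asarray(strain, dtype=float)
    stress = np.asarray(stress, dtype=float)

    peak = np.argmax(strain)                                   # split the cycle at peak strain
    loading_energy = np.trapz(stress[:peak + 1], strain[:peak + 1])
    unloading_energy = np.trapz(stress[peak:][::-1], strain[peak:][::-1])

    return (loading_energy - unloading_energy) / loading_energy
```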

Abstract:

Automobiles have deeply impacted the way in which we travel, but they have also contributed to many deaths and injuries due to crashes. A number of reasons for these crashes have been pointed out by researchers. Inexperience has been identified as a contributing factor to road crashes. Drivers' driving abilities also play a vital role in judging the road environment and reacting in time to avoid any possible collision. Therefore, a driver's perceptual and motor skills remain key factors impacting on road safety. Our failure to understand what is really important for learners, in terms of competent driving, is one of the many challenges for building better training programs. Driver training is one of the interventions aimed at decreasing the number of crashes that involve young drivers. Currently, there is a need to develop a comprehensive driver evaluation system that benefits from the advances in Driver Assistance Systems. A multidisciplinary approach is necessary to explain how driving ability evolves with on-road driving experience. To our knowledge, driver assistance systems have never been comprehensively used in a driver training context to assess the safety aspect of driving. The aim and novelty of this thesis is to develop and evaluate an Intelligent Driver Training System (IDTS) as an automated assessment tool that will help drivers and their trainers to comprehensively view complex driving manoeuvres and potentially provide effective feedback by post-processing the data recorded during driving. This system is designed to help driver trainers to accurately evaluate driver performance and has the potential to provide valuable feedback to the drivers. Since driving depends on fuzzy inputs from the driver (i.e. approximate estimation of the distance to other vehicles and of their speeds), it is necessary that the evaluation system is based on criteria and rules that handle the uncertain and fuzzy characteristics of the driving task. Therefore, the proposed IDTS utilises fuzzy set theory for the assessment of driver performance. The proposed research program focuses on integrating the multi-sensory information acquired from the vehicle, driver and environment to assess driving competencies. After information acquisition, the research focuses on automated segmentation of the selected manoeuvres from the driving scenario. This leads to the creation of a model that determines a "competency" criterion through the driving performance protocol used by driver trainers (i.e. expert knowledge) to assess drivers. This is achieved by comprehensively evaluating and assessing the data stream acquired from multiple in-vehicle sensors using fuzzy rules, and classifying the driving manoeuvres (i.e. overtake, lane change, T-crossing and turn) as low or high competency. The fuzzy rules use parameters such as following distance, gaze depth and scan area, distance with respect to lanes, and excessive acceleration or braking during the manoeuvres to assess competency. These rules that identify driving competency were initially designed with the help of expert knowledge (i.e. driver trainers). In order to fine-tune these rules and the parameters that define them, a driving experiment was conducted to identify the empirical differences between novice and experienced drivers. The results from the driving experiment indicated that significant differences existed between novice and experienced drivers in terms of their gaze pattern and duration, speed, stop time at the T-crossing, lane keeping, and the time spent in lanes while performing the selected manoeuvres. These differences were used to refine the fuzzy membership functions and rules that govern the assessment of the driving tasks. Next, this research focused on providing an integrated visual assessment interface to both driver trainers and their trainees. By providing a rich set of interactive graphical interfaces displaying information about the driving tasks, the IDTS visualisation module has the potential to give empirical feedback to its users. Lastly, the IDTS assessments were validated by comparing the system's objective assessments for the driving experiment with the subjective assessments of the driver trainers for particular manoeuvres. Results show that IDTS not only matched the subjective assessments made by driver trainers during the driving experiment but also identified some additional manoeuvres performed with low competency that were not identified by the driver trainers, owing to the increased mental workload of trainers when assessing the multiple variables that constitute driving. The validation of IDTS emphasised the need for an automated assessment tool that can segment the manoeuvres from the driving scenario, further investigate the variables within each manoeuvre to determine its competency, and provide integrated visualisation of the manoeuvre to its users (i.e. trainers and trainees). Through analysis and validation it was shown that IDTS is a useful assistance tool for driver trainers to empirically assess, and potentially provide feedback regarding, the manoeuvres undertaken by drivers.
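As an illustration of how fuzzy rules can grade a manoeuvre from parameters such as following distance, gaze scan and acceleration, here is a minimal sketch. The triangular membership breakpoints, the 0.5 threshold and the single rule are invented for the example; they are not the calibrated values from the thesis.

```python
def trimf(x, a, b, c):
    """Triangular membership: degree to which x belongs to the fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def overtake_competency(follow_dist_m, gaze_scan_deg, peak_accel_ms2):
    """Toy fuzzy assessment of one overtaking manoeuvre (illustrative values only)."""
    safe_distance = trimf(follow_dist_m, 15, 35, 60)    # "adequate following distance"
    wide_scan = trimf(gaze_scan_deg, 20, 45, 90)        # "adequate mirror/scan coverage"
    smooth_accel = trimf(peak_accel_ms2, -1, 0, 2.5)    # "no harsh acceleration or braking"

    # Rule: IF distance adequate AND scan adequate AND acceleration smooth
    #       THEN the manoeuvre is high competency (min as the fuzzy AND).
    degree = min(safe_distance, wide_scan, smooth_accel)
    return ("high competency" if degree >= 0.5 else "low competency", degree)

print(overtake_competency(follow_dist_m=30, gaze_scan_deg=50, peak_accel_ms2=1.0))
```

A full system of the kind described would combine many such rules per manoeuvre type and defuzzify the aggregated output rather than threshold a single rule.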

Abstract:

We show that when a soft polymer like poly(3-hexylthiophene) (P3HT) wraps multiwall carbon nanotubes (MWCNTs) by coiling around the main axis, a localized deformation of the nanotube structure is observed. High-resolution transmission electron microscopy shows that radial compressions of about 4% can take place, which could possibly lead to a larger interlayer distance between the nanotube inner walls and reduce the innermost nanotube radius. The mechanical stress due to the presence of the polymer was confirmed by Raman spectroscopic observation of a gradual upshift of the carbon nanotube G-band as the polymer content in the composites was progressively increased. Vibrational spectroscopy also indicates that charge transfer from the polymer to the nanotubes is responsible for a relative downshift of the peak frequency in high-P3HT-content samples. Transmission electron microscopy images acquired continuously at rising temperature show the MWCNT elastic compression and relaxation caused by polymer rearrangement on the nanotube surface.

Abstract:

Cell-based therapies, as they apply to tissue engineering and regenerative medicine, require cells capable of self-renewal and differentiation, and a prerequisite is to be able to prepare an effective dose of ex vivo expanded cells for autologous transplants. The in vivo identification of a source of physiologically relevant cell types suitable for cell therapies therefore figures as an integral part of tissue engineering. Stem cells serve as a reserve for biological repair, having the potential to differentiate into a number of specialised cell types within the body; they therefore represent the most useful candidates for cell-based therapies. The primary goal of stem cell research is to produce cells that are both patient specific and have properties suitable for the specific conditions they are intended to remedy. From a purely scientific perspective, stem cells allow scientists to gain a deeper understanding of developmental biology and regenerative therapies. Stem cells have acquired a number of uses in regenerative medicine, immunotherapy and gene therapy, but it is in the area of tissue engineering that they generate most excitement, primarily as a result of their capacity for self-renewal and pluripotency. A unique feature of stem cells is their ability to maintain an uncommitted quiescent state in vivo and then, once triggered by conditions such as disease, injury or natural wear and tear, serve as a reservoir and natural support system to replenish lost cells. Although these cells retain the plasticity to differentiate into various tissues, being able to control this differentiation process is still one of the biggest challenges facing stem cell research. In an effort to harness the potential of these cells, a number of studies have been conducted using both embryonic/foetal and adult stem cells. The use of embryonic stem cells (ESCs) has been hampered by strong ethical and political concerns, despite their perceived versatility due to their pluripotency. Ethical issues aside, other concerns raised with ESCs relate to the possibility of tumorigenesis, immune rejection and complications with immunosuppressive therapies, all of which add layers of complication to the application of ESCs in research and have led to the search for alternative sources of stem cells. The adult tissues of higher organisms harbour cells, termed adult stem cells, which are reminiscent of unprogrammed stem cells. A number of sources of adult stem cells have been described. Bone marrow is by far the most accessible source of two potent populations of adult stem cells, namely haematopoietic stem cells (HSCs) and bone marrow mesenchymal stem cells (BMSCs). Autologously harvested adult stem cells can, in contrast to embryonic stem cells, readily be used in autografts, since immune rejection is not an issue, and their use in scientific research has not attracted the ethical concerns that have accompanied embryonic stem cells. The major limitation to their use, however, is the fact that adult stem cells are exceedingly rare in most tissues. This makes identifying and isolating these cells problematic, bone marrow being perhaps the only notable exception. Unlike the case of HSCs, there are as yet no rigorous criteria for characterizing MSCs. Changing perceptions of the pluripotency of MSCs in recent studies have expanded their potential applications; however, the underlying molecular pathways which impart the features distinctive to MSCs remain elusive. Furthermore, the sparse in vivo distribution of these cells imposes a clear limitation on their study in vitro. Also, when MSCs are cultured in vitro, there is a loss of the in vivo microenvironment, resulting in a progressive decline in proliferation potential and multipotentiality. This is further exacerbated with increased passage numbers in culture, characterized by the onset of senescence-related changes. As a consequence, it is necessary to establish protocols for generating large numbers of MSCs without affecting their differentiation potential. MSCs are capable of differentiating into mesenchymal tissue lineages, including bone, cartilage, fat, tendon, muscle, and marrow stroma. Recent findings indicate that adult bone marrow may also contain cells that can differentiate into the mature, nonhematopoietic cells of a number of tissues, including cells of the liver, kidney, lung, skin, gastrointestinal tract, and myocytes of heart and skeletal muscle. MSCs can readily be expanded in vitro, can be genetically modified by viral vectors, and can be induced to differentiate into specific cell lineages by changing the microenvironment, properties which make these cells ideal vehicles for cellular gene therapy. MSCs can also exert profound immunosuppressive effects via modulation of both cellular and innate immune pathways, a property that allows them to overcome the issue of immune rejection. Despite the many attractive features associated with MSCs, there are still many hurdles to overcome before these cells are readily available for use in clinical applications. The main concern relates to the in vivo characterization and identification of MSCs. The lack of a universal biomarker, their sparse in vivo distribution, and a steady age-related decline in their numbers make it necessary to decipher the reprogramming pathways and critical molecular players which govern the characteristics unique to MSCs. This book presents a comprehensive insight into the biology of adult stem cells and their utility in current regeneration therapies. The adult stem cell populations reviewed in this book include bone marrow derived MSCs, adipose derived stem cells (ASCs), umbilical cord blood stem cells, and placental stem cells. Features such as MSC circulation and trafficking, neuroprotective properties, and the nurturing roles and differentiation potential of multiple lineages are discussed in detail. In terms of therapeutic applications, the strengths of MSCs are presented and their roles in treatments for conditions such as osteoarthritis and Huntington's disease, as well as in periodontal regeneration and pancreatic islet transplantation, are discussed. An analysis comparing osteoblast differentiation of umbilical cord blood stem cells and MSCs is reviewed, as is a comparison of human placental stem cells and ASCs in terms of isolation, identification and the therapeutic applications of ASCs in bone and cartilage regeneration, as well as myocardial regeneration. It is my sincere hope that this book will update the reader on the research progress of MSC biology and the potential use of these cells in clinical applications. It will be the best reward to all contributors of this book if their efforts herein in some way help readers in their study, research and career development.

Abstract:

Academic libraries have been acquiring ebooks for their collections for a number of years, but uptake by some users has been curtailed by the limitations of screen reading on a traditional PC or laptop. Ebook readers promise to take us into a new phase of ebook development. Innovations include wireless connectivity, electronic paper, increased battery life, and customisable displays. The recent rapid take-up of ebook readers in the United States, particularly Amazon's Kindle, suggests that they may be about to gain mass-market acceptance. A bewildering number of ebook readers are being promoted by companies eager to gain market share. In addition, each month seems to bring a new ebook reader or a new model of an existing device. Library administrators are faced with the challenge of separating the hype from the reality and deciding when the time is right to invest in and support these new technologies. The Library at QUT, in particular the QUT Library Ebook Reference Group (ERG), has been closely following developments in ebook and ebook reader technologies. During mid-2010, QUT Library undertook a trial of a range of ebook readers available to Australian consumers. Each of the ebook readers acquired was evaluated by members of the QUT Library ERG and two student focus groups. The major criteria for evaluation were usability, functionality, accessibility and compatibility with QUT Library's existing ebook collection. The two student focus groups evaluated the ebook readers mostly according to the criteria of usability and functionality. This paper discusses these evaluations and outlines a recommendation about which type (or types) of ebook readers could be acquired and made available for lending to students.

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration. The models needed to be able to be calibrated using data acquired at these locations, and their output needed to be able to be validated with data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. The models also needed to be adaptable to variable operating conditions, so that they could be applied, where possible, to other similar systems and facilities. It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models over a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled: some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations, merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams. On-ramp and total upstream flows are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995). The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delay, which reaches infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections are located upstream of on-ramps than when signalised intersections are, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and provide further insight into the nature of operations.
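For readers unfamiliar with the gap acceptance machinery referred to above, the sketch below evaluates the standard absorption-capacity expression for a minor stream merging into a major stream whose headways follow Cowan's M3 distribution (Troutbeck's formulation). The numerical parameter values are illustrative, and the limited-priority and lane-changing corrections developed in the thesis are deliberately omitted.

```python
import math

def m3_gap_acceptance_capacity(q_major_vph, alpha, delta=1.0, t_c=4.0, t_f=1.1):
    """Minor-stream (merge) capacity under Cowan M3 major-stream headways.

    q_major_vph : major (kerb lane) flow in veh/h
    alpha       : proportion of free (unbunched) major-stream vehicles, i.e.
                  proportion of headways greater than the minimum headway
    delta       : minimum headway in the major stream (s); 1 s as quoted above
    t_c, t_f    : critical gap and follow-on time (s); values here are assumed
                  for illustration, within the ranges quoted above
    Returns the absorption capacity of the merge in veh/h.
    """
    q = q_major_vph / 3600.0                       # major flow in veh/s
    lam = alpha * q / (1.0 - delta * q)            # decay rate of free headways
    cap = q * alpha * math.exp(-lam * (t_c - delta)) / (1.0 - math.exp(-lam * t_f))
    return cap * 3600.0

# e.g. 1200 veh/h kerb-lane flow with 70% free vehicles (assumed inputs)
print(round(m3_gap_acceptance_capacity(1200, alpha=0.7)))
```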

Abstract:

Understanding the motion characteristics of on-site objects is desirable for the analysis of construction work zones, especially in problems related to safety and productivity studies. This article presents a methodology for rapid object identification and tracking. The proposed methodology contains algorithms for spatial modeling and image matching. A high-frame-rate range sensor was utilized for spatial data acquisition. The experimental results indicated that an occupancy grid spatial modeling algorithm could quickly build a suitable work zone model from the acquired data. The results also showed that an image matching algorithm is able to find the most similar object from a model database and from spatial models obtained from previous scans. It is then possible to use the matched information to successfully identify and track objects.
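As a rough illustration of the occupancy-grid spatial modelling step described above, the sketch below bins range-sensor returns into a 2D grid and marks cells as occupied; the cell size, extent and hit threshold are assumptions, not parameters from the article.

```python
import numpy as np

def occupancy_grid(points_xy, cell_size=0.2, extent=20.0, min_hits=3):
    """Build a 2D occupancy grid from range-sensor returns.

    points_xy : (N, 2) array of x, y hit coordinates in metres, in the
                sensor/work-zone frame
    cell_size : grid resolution in metres (assumed value)
    extent    : half-width of the modelled area in metres (assumed value)
    min_hits  : hits required before a cell is marked occupied (assumed value)
    Returns a boolean grid where True marks occupied cells.
    """
    n = int(2 * extent / cell_size)
    counts = np.zeros((n, n), dtype=int)

    idx = np.floor((np.asarray(points_xy, dtype=float) + extent) / cell_size).astype(int)
    valid = (idx >= 0).all(axis=1) & (idx < n).all(axis=1)
    np.add.at(counts, (idx[valid, 1], idx[valid, 0]), 1)   # row = y, column = x

    return counts >= min_hits
```

Matching an object against a model database would then compare grids (or features extracted from them) between successive scans, as the article describes.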

Abstract:

The new configuration proposed in this paper for the Marx Generator (MG) aims to generate high voltage for pulsed power applications with a reduced number of semiconductor components and a more efficient load-supplying process. The main idea is to charge two groups of capacitors in parallel through an inductor, taking advantage of the resonance phenomenon to charge each capacitor up to double the input voltage level. In each resonant half-cycle, one of the capacitor groups is charged; eventually the charged capacitors are connected in series so that the sum of the capacitor voltages appears at the output of the topology. This topology can be considered a modified Marx generator based on the resonant concept. Simulated models of this converter have been investigated in the Matlab/SIMULINK platform and a prototype setup has been implemented in the laboratory. The results acquired from both fully satisfy expectations regarding proper operation of the converter.
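The voltage-doubling effect the abstract relies on is the standard behaviour of ideal LC resonant charging: a lossless inductor between a DC source and an initially discharged capacitor lets the capacitor ring up to twice the input voltage after one resonant half-cycle. The small sketch below only illustrates that textbook relation; the component values are assumed and are not taken from the paper.

```python
import math

def resonant_charge(v_in, inductance_h, capacitance_f):
    """Ideal LC resonant charging of a capacitor from a DC source.

    Returns (peak capacitor voltage, half-cycle charging time in seconds)
    for the lossless case, where the peak is twice the input voltage.
    """
    v_peak = 2.0 * v_in                                   # ideal, lossless doubling
    t_half = math.pi * math.sqrt(inductance_h * capacitance_f)
    return v_peak, t_half

# e.g. 1 kV input, 100 uH, 1 uF (assumed component values)
print(resonant_charge(1e3, 100e-6, 1e-6))
```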

Abstract:

The new configuration proposed in this paper for the Marx Generator (MG) aims to generate high voltage for pulsed power applications with a reduced number of semiconductor components and a more efficient load-supplying process. The main idea is to charge two groups of capacitors in parallel through an inductor, taking advantage of the resonance phenomenon to charge each capacitor up to double the input voltage level. In each resonant half-cycle, one of the capacitor groups is charged; eventually the charged capacitors are connected in series so that the sum of the capacitor voltages appears at the output of the topology. This topology can be considered a modified Marx generator based on the resonant concept. Simulated models of this converter have been investigated in the Matlab/SIMULINK platform and the acquired results fully satisfy expectations regarding proper operation of the converter.

Abstract:

A Geant4-based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian iX Clinac. The computer-aided design interface of Geant4 was used to accurately model the LINAC components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of radiotherapy treatment plans. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the chair test.

Abstract:

During a three-month stint at a production company, Alan McKee discovered that some of the knowledge required to work in television can only be acquired through practical experience. Here he offers some tips to help students successfully transition into the industry.

Abstract:

Human hair fibres are ubiquitous in nature and are found frequently at crime scenes, often as a result of exchange between the perpetrator, victim and/or the surroundings according to Locard's Principle. Therefore, hair fibre evidence can provide important information for crime investigation. For human hair evidence, the current forensic methods of analysis rely on comparisons of either hair morphology by microscopic examination or nuclear and mitochondrial DNA analyses. Unfortunately, in some instances the use of microscopy and DNA analyses is difficult and often not feasible. This dissertation is arguably the first comprehensive investigation aimed at comparing, classifying and identifying single human scalp hair fibres with the aid of FTIR-ATR spectroscopy in a forensic context. Spectra were collected from the hair of 66 subjects of Asian, Caucasian and African (i.e. African-type) origin. The fibres ranged from untreated to mildly and heavily cosmetically treated hair. The collected spectra reflect the physical and chemical nature of a hair near the surface, particularly the cuticle layer. In total, 550 spectra were acquired and processed to construct a relatively large database. To assist with the interpretation of the complex spectra from various types of human hair, derivative spectroscopy and chemometric methods were utilised: Principal Component Analysis (PCA), Fuzzy Clustering (FC) and the Multi-Criteria Decision Making (MCDM) programs Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and Geometrical Analysis for Interactive Aid (GAIA). FTIR-ATR spectroscopy had two important advantages over previous methods: (i) sample throughput and spectral collection were significantly improved (no physical flattening or microscope manipulations), and (ii) given the recent advances in FTIR-ATR instrument portability, there is real potential to transfer this work's findings seamlessly to on-field applications. The "raw" spectra, spectral subtractions and second-derivative spectra were compared to demonstrate the subtle differences in human hair. SEM images were used as corroborative evidence of the surface topography of hair, indicating that the condition of the cuticle surface could be of three types: untreated, mildly treated and chemically treated. Extensive studies of the potential spectral band regions responsible for matching and discriminating the various types of hair samples suggested that the 1690-1500 cm⁻¹ IR spectral region was to be preferred over the commonly used 1750-800 cm⁻¹ region. The principal reason was the presence of the highly variable spectral profiles of cystine oxidation products (1200-1000 cm⁻¹), which contributed significantly to spectral scatter and hence poor hair sample matching. In the preferred 1690-1500 cm⁻¹ region, conformational changes in the keratin protein, attributed to α-helix to β-sheet transitions in the Amide I and Amide II vibrations, played a significant role in the matching and discrimination of the spectra and hence of the hair fibre samples. For gender comparison, the Amide II band is significant for differentiation. The results illustrate that male hair spectra exhibit a more intense β-sheet vibration in the Amide II band at approximately 1511 cm⁻¹, whilst female hair spectra display a more intense α-helical vibration at 1520-1515 cm⁻¹. In terms of chemical composition, female hair spectra exhibit greater intensities of the amino acid tryptophan (1554 cm⁻¹) and of aspartic and glutamic acid (1577 cm⁻¹). It was also observed that, for the separation of samples based on racial differences, untreated Caucasian hair was discriminated from Asian hair as a result of higher levels of the amino acid cystine and of cysteic acid. However, when mildly or chemically treated, Asian and Caucasian hair fibres are similar, whereas African-type hair fibres are different. In terms of the investigation's novel contribution to the field of forensic science, it has allowed the development of a novel, multifaceted, methodical protocol where previously none had existed. The protocol is a systematic method to rapidly investigate unknown or questioned single human hair FTIR-ATR spectra from subjects of different gender and racial origin, including fibres with different cosmetic treatments. Unknown or questioned spectra are first separated on the basis of chemical treatment (i.e. untreated, mildly treated or chemically treated), then gender, and then racial origin (i.e. Asian, Caucasian and African-type). The methodology has the potential to complement the current forensic methods of analysing fibre evidence (i.e. microscopy and DNA), providing information at the morphological, genetic and structural levels.
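The chemometric workflow above (second-derivative preprocessing followed by PCA on the preferred 1690-1500 cm⁻¹ window) can be sketched as follows. The Savitzky-Golay window, polynomial order, scaling and number of components are illustrative choices, not the settings used in the dissertation.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

def pca_scores(spectra, wavenumbers, band=(1500.0, 1690.0), n_components=3):
    """Chemometric-style screening of FTIR-ATR hair spectra.

    spectra     : (n_samples, n_points) absorbance matrix
    wavenumbers : (n_points,) wavenumber axis in cm^-1
    band        : spectral window to keep; 1690-1500 cm^-1 as preferred above
    """
    mask = (wavenumbers >= band[0]) & (wavenumbers <= band[1])
    x = np.asarray(spectra, dtype=float)[:, mask]

    # Second-derivative preprocessing (Savitzky-Golay), then row-wise scaling
    x = savgol_filter(x, window_length=11, polyorder=3, deriv=2, axis=1)
    x = (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

    return PCA(n_components=n_components).fit_transform(x)
```

Score plots from such a decomposition are the kind of representation on which the clustering and PROMETHEE/GAIA ranking steps described above would then operate.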

Abstract:

In asset-intensive industries such as mining, oil and gas, and utilities, most capital expenditure goes towards acquiring engineering assets. The process of acquiring assets is called "procurement" or "acquisition". An asset procurement decision should take into consideration the installation, commissioning, operational, maintenance and disposal needs of an asset or spare. However, such cross-functional collaboration and communication does not appear to happen between the engineering, maintenance, warehousing and procurement functions in many asset-intensive industries. Acquisition planning and execution are two distinct parts of the asset acquisition process. Acquisition planning, or procurement planning, is responsible for determining exactly what is required to be purchased, and it is important that an asset acquisition decision is the result of a cross-functional decision-making process. An acquisition decision leads to a formal purchase order. The most costly asset decisions occur before the assets are even acquired; an acquisition decision should therefore be the outcome of an integrated planning and decision-making process. Asset-intensive organizations in Australia, both government and non-government, spent AUD 102.5 billion on asset acquisition in 2008-09. There is widespread evidence of many assets and spares never being used or utilized and eventually being written off. This clearly shows that many organizations end up buying assets or spares that were not required or did not conform to the needs of the user functions. This is due to the fact that strategic, software-driven procurement processes do not consider all the requirements from the various functions within the organization that contribute to the operation and maintenance of the asset over its life cycle. There is a great deal of research on how to implement an effective procurement process, and numerous software solutions are available for executing one. However, little research has been done on how to arrive at a cross-functional procurement planning process, and it is also important to link the procurement planning process to the procurement execution process. This research discusses the "Acquisition Engineering Model" (AEM) framework, which aims to assist acquisition decision making based on various criteria so as to satisfy cross-functional organizational requirements. The AEM considers inputs from corporate asset management strategy, production management, maintenance management, warehousing, finance and HSE. It is therefore essential that a multi-criteria-driven acquisition planning process is carried out and that its output is fed to the asset acquisition (procurement execution) process. An effective procurement decision-making framework for acquisition planning that considers these various functional criteria is discussed in this paper.
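The abstract does not specify how the AEM framework aggregates the cross-functional criteria it lists; the minimal weighted-sum sketch below is offered purely as an illustration of multi-criteria scoring of competing acquisition options. The criterion names, weights and scores are hypothetical and are not taken from the paper.

```python
def acquisition_score(option_scores, weights):
    """Weighted-sum multi-criteria score for one acquisition option.

    option_scores : dict of criterion name -> normalised score in [0, 1]
    weights       : dict of criterion name -> relative weight
    """
    total_weight = sum(weights.values())
    return sum(weights[c] * option_scores.get(c, 0.0) for c in weights) / total_weight

# Hypothetical criteria reflecting the cross-functional inputs named above
weights = {"operations_fit": 0.30, "maintenance_support": 0.25,
           "warehousing_spares": 0.15, "lifecycle_cost": 0.20, "hse": 0.10}
vendor_a = {"operations_fit": 0.8, "maintenance_support": 0.6,
            "warehousing_spares": 0.7, "lifecycle_cost": 0.5, "hse": 0.9}
print(round(acquisition_score(vendor_a, weights), 3))
```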

Abstract:

This short film, created by David Megarrity and Luke Monsour, experimented within a short timeframe with the challenges of superimposing hand-drawn backgrounds, non-verbal action, and a short, sharp shoot. The aim was also to find a single piece of standalone music that would act as an unedited soundtrack. It won Best Queensland Film at the Woodford Film Festival in 2005, and was screened at Base-Court, Lausanne, Switzerland, in 2006 and at the Westgarth Film Festival in 2005. It was acquired by the comedy website minimovie in 2007.