987 results for soft-commutation techniques


Relevance: 20.00%

Publisher:

Abstract:

High energy bone fractures resulting from impact trauma are often accompanied by subcutaneous soft tissue injuries, even if the skin remains intact. There is evidence that such closed soft tissue injuries affect the healing of bone fractures, and vice versa. Despite this knowledge, most impact trauma studies in animals have focussed on bone fractures or soft tissue trauma in isolation. However, given the simultaneous impact on both tissues, a better understanding of the interaction between these two injuries is necessary to optimise clinical treatment. The aim of this study was therefore to develop a new experimental model and characterise, for the first time, the healing of a complex fracture with concurrent closed soft tissue trauma in sheep. A pendulum impact device was designed to deliver a defined and standardised impact to the distal thigh of sheep, causing a reproducible contusion injury to the subcutaneous soft tissues. In a subsequent procedure, a reproducible butterfly fracture (AO C3-type) was created in the sheep’s femur, which was initially stabilised for 5 days by an external fixator construct to allow the soft tissue swelling to recede, and ultimately in a bridging construct using locking plates. The combined injuries were applied to twelve sheep and the healing observed for four or eight weeks (six animals per group) until sacrifice. The pendulum impact led to a moderate to severe circumferential soft tissue injury with significant bruising, haematomas and partial muscle disruptions. Post-traumatic measurements showed elevated intra-compartmental pressure and circulating tissue breakdown markers, with recovery to normal, pre-injury values within four days. Clinically, no neurovascular deficiencies were observed. Bi-weekly radiological analysis of the healing fractures showed progressive callus healing over time, with the average number of callus bridges increasing from 0.4 at two weeks to 4.2 at eight weeks.
Biomechanical testing after sacrifice showed increasing torsional stiffness between four and eight weeks healing time from 10% to 100%, and increasing ultimate torsional strength from 10% to 64% (relative to the contralateral control limb). Our results demonstrate the robust healing of a complex femur fracture in the presence of a severe soft tissue contusion injury in sheep and confirm the establishment of a clinically relevant experimental model for research aimed at improving the treatment of bone fractures accompanied by closed soft tissue injuries.

Relevance: 20.00%

Publisher:

Abstract:

A number of mathematical models investigating certain aspects of the complicated process of wound healing have been reported in the literature in recent years. However, effective numerical methods and supporting error analysis for the fractional equations which describe the process of wound healing are still limited. In this paper, we consider the numerical simulation of a fractional mathematical model of epidermal wound healing (FMM-EWH), which is based on the coupled advection-diffusion equations for cell and chemical concentration in a polar coordinate system. The space fractional derivatives are defined in the left and right Riemann-Liouville sense. Fractional orders in the advection and diffusion terms belong to the intervals (0, 1) or (1, 2], respectively. Several numerical techniques are employed. Firstly, the coupled advection-diffusion equations are decoupled into a single space fractional advection-diffusion equation in a polar coordinate system. Secondly, we propose a new implicit difference method for simulating this equation, using the equivalence of the Riemann-Liouville and Grünwald-Letnikov fractional derivative definitions. Thirdly, its stability and convergence are discussed. Finally, some numerical results are given to demonstrate the theoretical analysis.
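The shifted Grünwald-Letnikov approximation at the heart of such implicit schemes can be sketched in a simplified setting. The snippet below is a minimal illustration, not the paper's coupled polar-coordinate scheme: it builds the implicit-step matrix for a 1-D space-fractional diffusion equation u_t = d ∂^α u/∂x^α with α ∈ (1, 2], a left-sided derivative only, and zero boundary values; all grid sizes and coefficients are arbitrary toy values.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov coefficients g_k = (-1)^k * C(alpha, k),
    computed via the recurrence g_k = g_{k-1} * (1 - (alpha + 1) / k)."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def implicit_step_matrix(nx, dx, dt, alpha, d):
    """System matrix for one implicit Euler step of u_t = d * D^alpha u,
    using the shifted Grünwald-Letnikov stencil u_{i-k+1}, k = 0..i+1
    (terms falling outside the grid are dropped, i.e. zero Dirichlet)."""
    g = gl_weights(alpha, nx + 1)
    r = d * dt / dx**alpha
    A = np.eye(nx)
    for i in range(nx):
        for k in range(0, i + 2):   # shifted stencil reaches one node right
            j = i - k + 1
            if 0 <= j < nx:
                A[i, j] -= r * g[k]
    return A

# One implicit time step: solve A u^{n+1} = u^n for a Gaussian initial bump.
nx, dx, dt, alpha, d = 50, 0.02, 1e-4, 1.8, 1.0
u = np.exp(-100 * (np.linspace(0, 1, nx) - 0.5) ** 2)
A = implicit_step_matrix(nx, dx, dt, alpha, d)
u_next = np.linalg.solve(A, u)
```

Because each weight follows from the previous one via g_k = g_{k-1}(1 - (α+1)/k), the full stencil costs only O(N) per row to build, and the resulting matrix is strictly diagonally dominant, which is what the stability analysis of such schemes exploits.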

Relevance: 20.00%

Publisher:

Abstract:

This chapter provides researchers with a guide to some of the types of dating techniques that can be used in geomorphological investigations and the issues that need to be addressed when using geochronological data, specifically issues relating to accuracy and precision. The chapter also introduces the 'types' of dating methods that are commonly used in geomorphological studies, including sidereal, isotopic, radiogenic, and chemical dating methods.

Relevance: 20.00%

Publisher:

Abstract:

Effective digital human model (DHM) simulation of automotive driver packaging ergonomics, safety and comfort depends on accurate modelling of occupant posture, which is strongly related to the mechanical interaction between human body soft tissue and flexible seat components. This paper presents a finite-element study simulating the deflection of seat cushion foam and supportive seat structures, as well as human buttock and thigh soft tissue when seated. The three-dimensional data used for modelling thigh and buttock geometry were taken from one 95th percentile male subject, representing the bivariate percentiles of the combined hip breadth (seated) and buttock-to-knee length distributions of a selected Australian and US population. A thigh-buttock surface shell based on these data was generated for the analytic model. A 6 mm neoprene layer was offset from the shell to account for the compression of body tissue expected through sitting in a seat. The thigh-buttock model is therefore made of two layers, covering thin to moderate thigh and buttock proportions, but not more fleshy sizes. To replicate the effects of skin and fat, the neoprene rubber layer was modelled as a hyperelastic material with viscoelastic behaviour in a Neo-Hookean material model. Finite element (FE) analysis was performed in ANSYS V13 WB (Canonsburg, USA). It is hypothesized that the presented FE simulation delivers a valid result compared to a standard SAE physical test and the real phenomenon of human-seat indentation. The analytical model is based on the CAD assembly of a Ford Territory seat. The optimized seat frame, suspension and foam pad CAD data were transformed and meshed into FE models and indented by the two-layer, soft-surface human FE model.
Converging results with the least computational effort were achieved for a bonded connection between cushion and seat base as well as between cushion and suspension, no separation between neoprene and indenter shell, and a frictional connection between cushion pad and neoprene. The result is compared to a previous simulation of an indentation with a hard-shell human finite-element model of equal geometry, and to the physical indentation result, which is matched with very high fidelity. We conclude that (a) SAE composite buttock form indentation of a suspended seat cushion can be validly simulated in a FE model of merely similar geometry, but using a two-layer hard/soft structure, and (b) human-seat indentation of a suspended seat cushion can be validly simulated with a simplified human buttock-thigh model for a selected anthropometry.
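As a rough illustration of the Neo-Hookean material model mentioned above (ignoring the viscoelastic contribution, and with made-up parameter values), the Cauchy stress of an incompressible Neo-Hookean solid under uniaxial stretch reduces to a one-line formula:

```python
def neo_hookean_uniaxial_stress(stretch, mu):
    """Cauchy stress in the loading direction for an incompressible
    Neo-Hookean solid under uniaxial stretch l: sigma = mu * (l^2 - 1/l).
    mu is the shear modulus (a hypothetical value in any example use)."""
    return mu * (stretch ** 2 - 1.0 / stretch)
```

At stretch 1 the stress vanishes, and a 20% compression (stretch 0.8) gives a negative, i.e. compressive, stress, which is the regime relevant to buttock and thigh tissue under seating loads.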

Relevance: 20.00%

Publisher:

Abstract:

Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. Specifically, we examine two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks via partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavors.
We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme having this property with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
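The thesis's own puzzle construction is not reproduced here, but the flavour of modular-exponentiation puzzles with cheap verification can be sketched with the classic repeated-squaring (time-lock) design: solving requires t sequential modular squarings, while the issuer, who knows the trapdoor φ(n), verifies with a single short exponentiation. The prime sizes below are toy values for illustration only.

```python
def make_puzzle(p, q, t):
    """Issuer side: n = p*q, with phi(n) kept secret as the trapdoor.
    Toy primes only; a real deployment needs large random primes."""
    n = p * q
    phi = (p - 1) * (q - 1)
    return n, t, phi

def solve_puzzle(n, t):
    """Solver side: t sequential modular squarings, 2 -> 2^(2^t) mod n.
    The squarings are inherently sequential, which is the imposed effort."""
    x = 2
    for _ in range(t):
        x = (x * x) % n
    return x

def verify(n, t, phi, answer):
    """Issuer side: one modular exponentiation with a phi-reduced exponent,
    since 2^(2^t) = 2^(2^t mod phi(n)) (mod n) when gcd(2, n) = 1."""
    return answer == pow(2, pow(2, t, phi), n)
```

The asymmetry is the point: the solver's cost grows linearly in t, while the issuer's verification cost is essentially independent of t thanks to the trapdoor.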

Relevance: 20.00%

Publisher:

Abstract:

The feral pig, Sus scrofa, is a widespread and abundant invasive species in Australia. Feral pigs pose a significant threat to the environment, agricultural industry, and human health, and in far north Queensland they endanger World Heritage values of the Wet Tropics. Historical records document the first introduction of domestic pigs into Australia via European settlers in 1788 and subsequent introductions from Asia from 1827 onwards. Since this time, domestic pigs have been accidentally and deliberately released into the wild, and significant feral pig populations have become established, resulting in the declaration of this species as a class 2 pest in Queensland. The overall objective of this study was to assess the population genetic structure of feral pigs in far north Queensland, in particular to enable delineation of demographically independent management units. The identification of ecologically meaningful management units using molecular techniques can assist in targeting feral pig control to bring about effective long-term management. Molecular genetic analysis was undertaken on 434 feral pigs from 35 localities between Tully and Innisfail. Seven polymorphic and unlinked microsatellite loci were screened, and fixation indices (FST and analogues) and Bayesian clustering methods were used to identify population structure and management units in the study area. The hyper-variable mitochondrial control region (D-loop) of 35 feral pigs was also sequenced to identify pig ancestry. Three management units were identified in the study at a scale of 25 to 35 km. Even with the strong pattern of genetic structure identified in the study area, some evidence of long-distance dispersal and/or translocation was found, as a small number of individuals exhibited ancestry from a management unit other than the one in which they were sampled.
Overall, gene flow in the study area was found to be influenced by environmental features such as topography and land use, but no distinct or obvious natural or anthropogenic geographic barriers were identified. Furthermore, strong evidence was found for non-random mating between pigs of European and Asian breeds indicating that feral pig ancestry influences their population genetic structure. Phylogenetic analysis revealed two distinct mitochondrial DNA clades, representing Asian domestic pig breeds and European breeds. A significant finding was that pigs of Asian origin living in Innisfail and south Tully were not mating randomly with European breed pigs populating the nearby Mission Beach area. Feral pig control should be implemented in each of the management units identified in this study. The control should be coordinated across properties within each management unit to prevent re-colonisation from adjacent localities. The adjacent rainforest and National Park Estates, as well as the rainforest-crop boundary should be included in a simultaneous control operation for greater success.
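For readers unfamiliar with fixation indices, Wright's F_ST for a single biallelic locus illustrates the idea behind the multilocus microsatellite estimators used in the study (a deliberately simplified sketch, not the estimator actually employed):

```python
def fst(subpop_freqs, subpop_sizes):
    """Wright's F_ST for one biallelic locus: (H_T - H_S) / H_T, where
    H_S is the size-weighted mean within-subpopulation expected
    heterozygosity and H_T uses the pooled allele frequency."""
    total = sum(subpop_sizes)
    p_bar = sum(p * n for p, n in zip(subpop_freqs, subpop_sizes)) / total
    h_t = 2 * p_bar * (1 - p_bar)
    h_s = sum(2 * p * (1 - p) * n
              for p, n in zip(subpop_freqs, subpop_sizes)) / total
    return (h_t - h_s) / h_t
```

Identical allele frequencies across subpopulations give F_ST = 0 (no structure), while strongly diverged frequencies push F_ST towards 1, which is what "strong genetic structure" between management units means quantitatively.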

Relevance: 20.00%

Publisher:

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of the spatial resolution and are able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
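The gamma evaluation mentioned under Dose Comparison can be illustrated with a deliberately simplified 1-D global implementation (real implementations work in 3-D and interpolate the evaluated distribution onto a finer grid; the 3%/3 mm criteria below are conventional defaults, not values taken from this work):

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, x, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma evaluation in the style of Low et al.: for each
    reference point, the minimum over evaluated points of
    sqrt((dose_diff / dose_norm)^2 + (distance / dist_tol)^2).
    dose_tol is a fraction of the reference maximum (global normalisation),
    dist_tol is in the units of x (e.g. mm). Gamma <= 1 means the point
    agrees with the evaluated distribution within the criteria."""
    dose_norm = dose_tol * ref_dose.max()
    gammas = np.empty_like(ref_dose)
    for i, (xr, dr) in enumerate(zip(x, ref_dose)):
        dose_diff = (eval_dose - dr) / dose_norm
        dist = (x - xr) / dist_tol
        gammas[i] = np.sqrt(dose_diff**2 + dist**2).min()
    return gammas
```

The usual summary statistic is the pass rate, `np.mean(gamma_index(ref, ev, x) <= 1.0)`, which combines dose agreement in low-gradient regions with spatial agreement in high-gradient regions in a single number.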

Relevance: 20.00%

Publisher:

Abstract:

This book examines different aspects of Asian popular culture, including films, TV, music, comedy, folklore, cultural icons, the Internet and theme parks. It raises important questions such as: What are the implications of the popularity of Asian popular culture for globalization? Do regional forces impede the globalizing of cultures? Or does the Asian popular culture flow act as a catalyst or conveying channel for cultural globalization? Does the globalization of culture pose a threat to local culture? It addresses two seemingly contradictory and yet parallel processes in the circulation of Asian popular culture: the interconnectedness between Asian popular culture and western culture in an era of cultural globalization that turns subjects such as Pokémon, Hip Hop or Cosmopolitan into truly global phenomena, and the local derivatives and versions of global culture that are necessarily disconnected from their origins in order to cater for the local market. It thereby presents a collective argument that, whilst local social formations and patterns of consumption and participation in Asia are still very much dependent on global cultural developments and the phenomena of modernity, such dependence is often concretized, reshaped and distorted by the local media to cater for the local market.

Contents:
Introduction: Asian Popular Culture: The Global (Dis)continuity (Anthony Y.H. Fung)
Part 1: The Dominance of Global Continuity: Cultural Localization and Adaptation
1. One Region, Two Modernities: Disneyland in Tokyo and Hong Kong (Micky Lee and Anthony Y.H. Fung)
2. Comic Travels: Disney Publishing in the People’s Republic of China (Jennifer Altehenger)
3. When Chinese Youth Meet Harry Potter: Translating Consumption and Middle Class Identification (John Nguyet Erni)
4. New Forms of Transborder Visuality in Urban China: Saving Face for Magazine Covers (Eric Kit-Wai Ma)
5. Cultural Consumption and Masculinity: A Case Study of GQ Magazine Covers in Taiwan (Hong-Chi Shiau)
Part 2: Global Discontinuity: The Local Absorption of Global Culture
6. An Unlocalized and Unglobalized Subculture: English Language Independent Music in Singapore (Kai Khiun Liew and Shzr Ee Tan)
7. The Localized Production of Jamaican Music in Thailand (Viriya Sawangchot)
8. Consuming Online Games in Taiwan: Global Games and Local Market (Lai-Chi Chen)
9. The Rise of the Korean Cinema in Inbound and Outbound Globalization (Shin Dong Kim)
Part 3: Cultural Domestication: A New Form of Global Continuity
10. Pocket Capitalism and Virtual Intimacy: Pokémon as a Symptom of Post-Industrial Youth Culture (Anne Allison)
11. Playing the Global Game: Japan Brand and Globalization (Kukhee Choo)
Part 4: China as a Rising Market: Cultural Antagonism and Globalization
12. China’s New Creative Strategy: The Utilization of Cultural Soft Power and New Markets (Michael Keane and Bonnie Liu)
13. Renationalizing Hong Kong Cinema: The Gathering Force of the Mainland Market (Michael Curtin)

Relevance: 20.00%

Publisher:

Abstract:

The increased adoption of business process management approaches, tools and practices has led organizations to accumulate large collections of business process models. These collections can easily include hundreds to thousands of models, especially in the context of multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories in such a way that their complexity does not hamper their practical usefulness as a means to describe and communicate business operations. This paper proposes a technique to automatically infer suitable names for business process models and fragments thereof. This technique is useful for model abstraction scenarios, for instance when user-specific views of a repository are required, or as part of a refactoring initiative aimed at simplifying the repository’s complexity. The technique is grounded in an adaptation of the theory of meaning to the realm of business process models. We implemented the technique in a prototype tool and conducted an extensive evaluation using three process model collections from practice and a case study involving process modelers with different experience levels.
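The paper's naming technique itself is grounded in a theory of meaning; purely to make the task concrete, here is a naive frequency-based stand-in that suggests a name for a process fragment from its activity labels (an illustrative sketch only, not the authors' method):

```python
from collections import Counter
import re

def suggest_name(activity_labels, top_n=3):
    """Naive name suggestion for a process fragment: pick the most
    frequent content words across its activity labels. A real technique,
    like the paper's, must also weigh structural and semantic context."""
    stop = {"the", "a", "an", "of", "to", "and", "for", "in", "on", "with"}
    words = []
    for label in activity_labels:
        words += [w for w in re.findall(r"[a-z]+", label.lower())
                  if w not in stop]
    return " ".join(w for w, _ in Counter(words).most_common(top_n)).title()
```

Even this crude heuristic shows why automated naming helps abstraction: a fragment containing "Check invoice", "Approve invoice" and "Archive invoice" is immediately recognisable from a name dominated by the word "invoice".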

Relevance: 20.00%

Publisher:

Abstract:

Poor health and injury represent major obstacles to the future economic security of Australia. The national economic cost of work-related injury is estimated at $57.5 billion per annum. Since exposure to high physical demands is a major risk factor for musculoskeletal injury, monitoring and managing such physical activity levels in workers is a potentially important injury prevention strategy. Current injury monitoring practices are inadequate for the provision of clinically valuable information about the tissue-specific responses to physical exertion. Injury of various soft tissue structures can manifest over time through the accumulation of micro-trauma. Such micro-trauma has a propensity to increase the risk of acute injuries to soft-tissue structures such as muscle or tendon. As such, the capacity to monitor biomarkers that result from the disruption of these tissues offers a means of assisting the pre-emptive management of subclinical injury prior to acute failure, or of evaluating recovery processes. Here we have adopted an in-vivo exercise-induced muscle damage model allowing the application of laboratory-controlled conditions to assist in uncovering biochemical indicators associated with soft-tissue trauma and recovery. Importantly, urine was utilised as the diagnostic medium since it is non-invasive to collect, more acceptable to workers and less costly to employers. Moreover, it is our hypothesis that exercise-induced tissue degradation products enter the circulation, are subsequently filtered by the kidney and pass through to the urine. To test this hypothesis a range of metabolomic and proteomic discovery-phase techniques were used, along with targeted approaches. Several small molecules relating to tissue damage were identified, along with a series of skeletal muscle-specific protein fragments resulting from exercise-induced soft-tissue damage. Each of the potential biomolecular markers appeared to be temporally present within urine.
Moreover, their abundance appeared to be associated with functional recovery following the injury. This discovery may have important clinical applications for the monitoring of a variety of inflammatory myopathies, as well as novel applications in monitoring the musculoskeletal health status of workers, professional athletes and/or military personnel to reduce the onset of potentially debilitating musculoskeletal injuries within these professions.

Relevance: 20.00%

Publisher:

Abstract:

Genomic DNA obtained from patient whole blood samples is a key element for genomic research. Advantages and disadvantages, in terms of time-efficiency, cost-effectiveness and laboratory requirements, of the procedures available to isolate nucleic acids need to be considered before choosing any particular method. These characteristics have not been fully evaluated for some laboratory techniques, such as the salting out method for DNA extraction, which has been excluded from comparison in the different studies published to date. We compared three different protocols (a traditional salting out method, a modified salting out method and a commercially available kit method) to determine the most cost-effective and time-efficient method to extract DNA. We extracted genomic DNA from whole blood samples obtained from breast cancer patient volunteers and compared the results in terms of the quantity (concentration of DNA extracted and DNA obtained per ml of blood used) and quality (260/280 ratio and polymerase chain reaction product amplification) of the obtained yield. On average, the three methods showed no statistically significant differences in the final result, but when we accounted for the time and cost of each method, very significant differences emerged. The modified salting out method resulted in a seven- and twofold reduction in cost compared to the commercial kit and the traditional salting out method, respectively, and reduced the time required from 3 days to 1 hour compared to the traditional salting out method. This highlights the modified salting out method as a suitable choice for laboratories and research centres, particularly when dealing with a large number of samples.

Relevance: 20.00%

Publisher:

Abstract:

Results of an interlaboratory comparison on size characterization of SiO2 airborne nanoparticles using on-line and off-line measurement techniques are discussed. This study was performed in the framework of Technical Working Area (TWA) 34, “Properties of Nanoparticle Populations”, of the Versailles Project on Advanced Materials and Standards (VAMAS), in project no. 3, “Techniques for characterizing size distribution of airborne nanoparticles”. Two types of nano-aerosols, consisting of (1) one population of nanoparticles with a mean diameter between 30.3 and 39.0 nm and (2) two populations of non-agglomerated nanoparticles with mean diameters between, respectively, 36.2–46.6 nm and 80.2–89.8 nm, were generated for characterization measurements. Scanning mobility particle size spectrometers (SMPS) were used for on-line measurements of the size distributions of the produced nano-aerosols. Transmission electron microscopy, scanning electron microscopy, and atomic force microscopy were used as off-line measurement techniques for nanoparticle characterization. Samples were deposited on appropriate supports such as grids, filters, and mica plates by electrostatic precipitation and by a filtration technique using SMPS-controlled generation upstream. The results for the main size distribution parameters (mean and mode diameters), obtained from several laboratories, were compared based on metrological approaches including metrological traceability, calibration, and evaluation of the measurement uncertainty. Internationally harmonized measurement procedures for airborne SiO2 nanoparticle characterization are proposed.

Relevance: 20.00%

Publisher:

Abstract:

A significant amount of speech is typically required for speaker verification system development and evaluation, especially in the presence of large intersession variability. This paper introduces source- and utterance-duration-normalized linear discriminant analysis (SUN-LDA) approaches to compensate for session variability in short-utterance i-vector speaker verification systems. Two variations of SUN-LDA are proposed in which normalization techniques are used to capture source variation from both short and full-length development i-vectors, one based upon pooling (SUN-LDA-pooled) and the other on concatenation (SUN-LDA-concat) across the duration- and source-dependent session variation. Both the SUN-LDA-pooled and SUN-LDA-concat techniques are shown to provide improvement over traditional LDA on the NIST 08 truncated 10sec-10sec evaluation conditions, with the SUN-LDA-concat technique achieving the highest improvement: a relative improvement in EER of 8% for mismatched conditions and over 3% for matched conditions over traditional LDA approaches.
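As background, the core LDA step that the SUN-LDA variants build upon can be sketched as follows: accumulate within- and between-class scatter over development i-vectors (speakers as classes) and keep the leading generalised eigenvectors. This sketch shows plain LDA only; a pooled variant in the spirit of SUN-LDA-pooled would, for example, simply include both short and full-length i-vectors of each speaker when accumulating the scatter matrices.

```python
import numpy as np

def lda_projection(X, labels, n_dims):
    """LDA projection matrix: maximise between-class over within-class
    scatter by taking the leading eigenvectors of Sw^-1 Sb. X is
    (n_samples, dim), labels gives the class (speaker) of each row."""
    classes = np.unique(labels)
    global_mean = X.mean(axis=0)
    dim = X.shape[1]
    Sw = np.zeros((dim, dim))   # within-class scatter
    Sb = np.zeros((dim, dim))   # between-class scatter
    for c in classes:
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - global_mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, order[:n_dims]]
```

Projecting i-vectors through the returned matrix discards directions dominated by session variability while preserving speaker-discriminative ones, which is precisely the compensation role LDA plays before scoring.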

Relevance: 20.00%

Publisher:

Abstract:

A people-to-people matching system (or a match-making system) refers to a system in which users join with the objective of meeting other users with a common need. Some real-world examples of these systems are employer-employee (in job search networks), mentor-student (in university social networks), consumer-to-consumer (in marketplaces) and male-female (in an online dating network). The network underlying these systems consists of two groups of users, and the relationships between users need to be captured for developing an efficient match-making system. Most of the existing studies utilize information about each of the users in isolation or about their interactions separately, and develop recommender systems using one form of information only. It is imperative to understand the linkages among the users in the network and use them in developing a match-making system. This study utilizes several social network analysis methods, such as graph theory, the small-world phenomenon, centrality analysis and density analysis, to gain insight into the entities and their relationships present in this network. This paper also proposes a new type of graph called an “attributed bipartite graph”. By using these analyses and the proposed type of graph, an efficient hybrid recommender system is developed which generates recommendations for new users and shows improvement in accuracy over the baseline methods.
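The paper's exact definition of an attributed bipartite graph is not reproduced here; the toy structure below simply illustrates the idea of combining two disjoint user groups, per-user attributes, and cross-group interactions, with a degree-centrality measure of the kind used in such analyses (all names and fields are hypothetical):

```python
from collections import defaultdict

class AttributedBipartiteGraph:
    """Minimal sketch: two disjoint user groups ('left' and 'right'),
    attribute dicts per user, and edges allowed only across groups."""

    def __init__(self):
        self.attrs = {"left": {}, "right": {}}
        self.edges = defaultdict(set)   # left user -> set of right users

    def add_user(self, side, user, **attributes):
        self.attrs[side][user] = attributes

    def add_interaction(self, left_user, right_user):
        # Bipartite constraint: edges may only cross the two groups.
        assert left_user in self.attrs["left"]
        assert right_user in self.attrs["right"]
        self.edges[left_user].add(right_user)

    def degree_centrality(self, left_user):
        """Fraction of the opposite group this user has interacted with."""
        return len(self.edges[left_user]) / max(len(self.attrs["right"]), 1)
```

Keeping attributes on the nodes and interactions on the edges is what lets a hybrid recommender fall back on attribute similarity for new users (who have no interaction history) while exploiting the link structure for established ones.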