974 results for Diamond, Particle, Shape, Abrasion, Wear
Abstract:
We studied the influence of signal variability on human and model observers for detection tasks with realistic simulated masses superimposed on real patient mammographic backgrounds and synthesized mammographic backgrounds (clustered lumpy backgrounds, CLB). Results under the signal-known-exactly (SKE) paradigm were compared with signal-known-statistically (SKS) tasks for which the observers did not have prior knowledge of the shape or size of the signal. Human observers' performance did not vary significantly when benign masses were superimposed on real images or on CLB. Uncertainty and variability in signal shape did not degrade human performance significantly compared with the SKE task, while variability in signal size did. Implementation of appropriate internal noise components allowed the fit of model observers to human performance.
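A minimal sketch of the kind of model observer the abstract refers to. This is not the paper's implementation: it uses a simple non-prewhitening (NPW) matched template on white-noise backgrounds, and the internal-noise scaling, signal shape, and trial counts are all our own illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def npw_2afc_pc(signal, n_trials=2000, bg_sigma=1.0, internal_sigma=0.0):
    """Proportion correct of a non-prewhitening (NPW) model observer in a
    two-alternative forced-choice (2AFC) detection task. The observer uses
    the signal itself as a matched template; `internal_sigma` adds Gaussian
    internal noise to each decision variable (illustrative sketch only)."""
    correct = 0
    for _ in range(n_trials):
        bg_present = rng.normal(0.0, bg_sigma, signal.shape)
        bg_absent = rng.normal(0.0, bg_sigma, signal.shape)
        t_present = np.sum(signal * (bg_present + signal)) \
            + rng.normal(0.0, internal_sigma)
        t_absent = np.sum(signal * bg_absent) + rng.normal(0.0, internal_sigma)
        correct += t_present > t_absent
    return correct / n_trials

# A small Gaussian-blob signal: adding internal noise degrades performance
# toward the 50% guessing floor, which is how model observers are fit to
# (lower) human performance.
g = np.exp(-((np.arange(16) - 8) ** 2) / 8.0)
sig = np.outer(g, g) * 0.5
pc_no_internal = npw_2afc_pc(sig)
pc_internal = npw_2afc_pc(sig, internal_sigma=5.0 * np.sqrt(np.sum(sig ** 2)))
```

Tuning `internal_sigma` until the model's proportion correct matches the measured human value is the basic fitting step the abstract alludes to.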
Abstract:
Brake wear particulate matter (PM) may provoke cardiovascular effects. A system was developed to expose cells to airborne PM from brakes. Six car models were tested, each with full stop and normal deceleration. PM numbers, mass and surface, metals, and carbon compounds were measured. Full stop produced higher PM number and mass concentrations than normal deceleration (up to 10 million particles/cm³ in a 0.2 m³ volume). 87% of the PM mass was in the fine (100 nm to 2.5 µm) and 12% in the coarse (2.5 to 10 µm) fraction, whereas 74% of the PM number was nanoscaled (ultrafine, < 0.1 µm) and 26% fine PM. Elemental concentrations were 2,364, 236, and 18 µg/m³ of iron, copper and manganese, respectively, and 664 and 36 µg/m³ of organic and elemental carbon. PM release differed between cars and braking behaviour. Temperature and humidity were stable. In conclusion, the established system seems feasible for exposing cell cultures to brake wear PM.
Abstract:
Thumb hypoplasia treatment requires considering every component of the maldevelopment. Types II and IIIA hypoplasia share common features such as first web space narrowing, hypoplasia or absence of the thenar muscles, and metacarpophalangeal (MCP) joint instability. Many surgical techniques to correct the malformation have been described. We report a surgical strategy that includes modifications of the usual techniques, which we found useful in reducing morbidity while optimizing results. A diamond-shaped kite flap was used to widen the first web space; its design allowed primary closure of the donor site using a Dufourmentel flap. The ring finger flexor digitorum superficialis was transferred for opposition, and the same tendon was used to stabilize the MCP joint on its ulnar and/or radial side, depending on whether the instability was uniplanar or more global. An omega-shaped K-wire was placed between the first and second metacarpals to maintain a wide opening of the first web space without stressing the reconstructed ulnar collateral ligament of the MCP joint. We report a clinical series of 15 patients (18 thumbs) who underwent this reconstructive program.
Abstract:
Wear of polyethylene is associated with aseptic loosening of orthopaedic implants and has been observed in hip and knee prostheses and anatomical implants for the shoulder. The reversed shoulder prostheses have not been assessed as yet. We investigated the volumetric polyethylene wear of the reversed and anatomical Aequalis shoulder prostheses using a mathematical musculoskeletal model. Movement and joint stability were achieved by EMG-controlled activation of the muscles. A non-constant wear factor was considered. Simulated activities of daily living were estimated from in vivo recorded data. After one year of use, the volumetric wear was 8.4 mm(3) for the anatomical prosthesis, but 44.6 mm(3) for the reversed version. For the anatomical prosthesis the predictions for contact pressure and wear were consistent with biomechanical and clinical data. The abrasive wear of the polyethylene in reversed prostheses should not be underestimated, and further analysis, both experimental and clinical, is required.
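Volumetric wear predictions of this kind are commonly built on Archard's wear law, V = k · F · s (wear volume = wear factor × contact force × sliding distance). The sketch below shows how a non-constant (load-dependent) wear factor, which the abstract mentions, can be accumulated over simulated activities; the wear-factor form, forces, distances, and cycle counts are invented for illustration and are not the study's data.

```python
def archard_wear_volume(activities, wear_factor):
    """Total volumetric wear (mm^3) accumulated over a list of activities.
    Each activity supplies a contact force F (N) and a sliding distance
    s (mm); wear_factor(F) returns k in mm^3 / (N * mm), allowing the
    non-constant wear factor the study considers. All numbers here are
    illustrative placeholders."""
    return sum(wear_factor(F) * F * s for F, s in activities)

# Hypothetical load-dependent wear factor: k decreases at higher loads.
k = lambda F: 1.0e-8 / (1.0 + F / 500.0)

# Two activities of daily living per cycle: (contact force N, sliding mm).
daily_cycle = [(400.0, 20.0), (250.0, 35.0)]

# Roughly 1000 cycles/day over one year of use.
annual_wear = 365 * 1000 * archard_wear_volume(daily_cycle, k)
```

With these made-up inputs the annual volume lands in the tens of mm³, the same order as the reversed-prosthesis figure reported in the abstract; the point of the sketch is only the bookkeeping, not the values.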
Abstract:
The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and the relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonically as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and the driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the long-velocity tail is essentially exponential, independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ~ exp(−c^n), with n ≈ 1.2, regardless of the fragmentation mechanism.
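The generalized exponential f(c) ~ exp(−c^n) can be made concrete with a small numerical check. This toy comparison is ours, not the paper's: it normalizes the distribution on c ≥ 0 by trapezoidal quadrature and shows that the reported n ≈ 1.2 gives a much heavier tail than the Gaussian case n = 2.

```python
import numpy as np

def generalized_exponential(c, n):
    """Normalized scaled speed distribution f(c) = A * exp(-c**n) on c >= 0.
    The constant A is computed numerically with the trapezoid rule on a
    grid wide enough that the truncated tail is negligible."""
    grid = np.linspace(0.0, 50.0, 200_001)
    fg = np.exp(-grid ** n)
    dx = grid[1] - grid[0]
    norm = dx * (fg[0] / 2 + fg[1:-1].sum() + fg[-1] / 2)  # trapezoid rule
    return np.exp(-np.asarray(c, float) ** n) / norm

# At c = 4 the n = 1.2 tail is orders of magnitude above the n = 2
# (Gaussian) tail, which is why the two regimes are distinguishable
# in simulation data.
tail_n12 = generalized_exponential(4.0, 1.2)
tail_n20 = generalized_exponential(4.0, 2.0)
```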
Abstract:
In medical imaging, merging automated segmentations obtained from multiple atlases has become standard practice for improving accuracy. In this letter, we propose two new fusion methods: "Global Weighted Shape-Based Averaging" (GWSBA) and "Local Weighted Shape-Based Averaging" (LWSBA). These methods extend the well-known Shape-Based Averaging (SBA) by additionally incorporating similarity information between the reference (i.e., atlas) images and the target image to be segmented. We also propose a new spatially varying similarity-weighted neighborhood prior model, and an edge-preserving smoothness term that can be used with many of the existing fusion methods. We first present our new Markov Random Field (MRF) based fusion framework that models the above-mentioned information. The proposed methods are evaluated in the context of lymph node segmentation in 3D CT images of the head and neck, and they resulted in more accurate segmentations than the existing SBA.
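The baseline the letter extends, Shape-Based Averaging, can be sketched compactly: each binary segmentation is converted to a signed distance map, the maps are averaged, and the zero level set of the mean defines the fused label. Adding one global weight per atlas, as below, gestures at the similarity weighting of GWSBA; the weights, data, and 2D setting are illustrative assumptions, not the letter's method or data.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def shape_based_average(masks, weights=None):
    """Fuse binary segmentations by (weighted) averaging of signed
    distance maps. Distances are positive outside the object and
    negative inside; pixels with a non-positive mean distance form
    the fused segmentation."""
    masks = [np.asarray(m, dtype=bool) for m in masks]
    if weights is None:
        weights = np.ones(len(masks))
    weights = np.asarray(weights, dtype=float) / np.sum(weights)
    signed_mean = sum(
        w * (distance_transform_edt(~m) - distance_transform_edt(m))
        for w, m in zip(weights, masks))
    return signed_mean <= 0

# Three noisy versions of a 5x5 square: each atlas misses or adds one
# pixel, yet the fused result recovers the clean square.
base = np.zeros((9, 9), dtype=bool)
base[2:7, 2:7] = True
m1 = base.copy(); m1[2, 2] = False   # missing corner
m2 = base.copy(); m2[6, 6] = False   # missing corner
m3 = base.copy(); m3[4, 8] = True    # spurious pixel
fused = shape_based_average([m1, m2, m3])
```

Averaging distances rather than votes is what gives SBA its smooth, shape-aware boundaries; the weighted variants then let more similar atlases pull the zero level set harder.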
Abstract:
Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models, in the form of Bayesian networks, address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are given primary focus because they allow the scientist to combine his or her prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
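The simplest instance of the Bayesian parameter estimation described here is the conjugate Beta-Binomial update for an unknown proportion, such as the background rate of GSR-like particles in a population. The sketch below shows that update; the prior choice and survey figures are illustrative placeholders, not the paper's data.

```python
def beta_binomial_posterior(a, b, k, n):
    """Conjugate Bayesian update for an unknown proportion: a Beta(a, b)
    prior combined with k 'positives' out of n observations yields a
    Beta(a + k, b + n - k) posterior."""
    return a + k, b + n - k

def beta_mean(a, b):
    """Posterior mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Uniform Beta(1, 1) prior; suppose 3 of 120 surveyed individuals with no
# known firearm exposure carried GSR-like particles (hypothetical survey).
a_post, b_post = beta_binomial_posterior(1, 1, 3, 120)
background_rate = beta_mean(a_post, b_post)
```

A posterior like this is exactly the kind of estimate that can then be plugged into the conditional probability tables of the Part I Bayesian networks, and perturbing it shows the likelihood-ratio sensitivity the paper studies.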
Abstract:
The shape of alliance processes over the course of psychotherapy has already been studied in several process-outcome studies on very brief psychotherapy. The present study applies the shape-of-change methodology to short-term dynamic psychotherapies and complements this method with hierarchical linear modeling. A total of 50 psychotherapies of up to 40 sessions were included. Alliance was measured at the end of each session. The results indicate that a linear progression model is most adequate. Three main patterns were found: stable, linear, and quadratic growth. The linear growth pattern, along with the slope parameter, was related to treatment outcome. This study sheds additional light on alliance process research, underscores the importance of linear alliance progression for outcome, and also fosters a better understanding of its limitations.
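The linear-growth pattern found here amounts to estimating a slope of the alliance score across sessions for each therapy. The sketch below extracts per-therapy slopes by ordinary least squares, a simplified stand-in for the random-slope component of a hierarchical linear model; the session data are synthetic, not the study's.

```python
import numpy as np

def session_slopes(alliance_series):
    """Per-therapy linear slope of alliance scores across sessions,
    fitted by ordinary least squares. In a full hierarchical linear
    model these slopes would be random effects shrunk toward a group
    mean; here each therapy is fitted independently for illustration."""
    slopes = []
    for y in alliance_series:
        x = np.arange(len(y))
        slopes.append(np.polyfit(x, y, 1)[0])  # leading coeff = slope
    return np.array(slopes)

rng = np.random.default_rng(1)
# Three synthetic 12-session therapies: stable, mild linear growth,
# and steep linear growth, each with small session-to-session noise.
series = [3.5 + s * np.arange(12) + rng.normal(0.0, 0.2, 12)
          for s in (0.00, 0.05, 0.12)]
slopes = session_slopes(series)
```

Relating such slope estimates to end-of-treatment outcome scores is the step that links the linear growth pattern to outcome in the study.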
Abstract:
The distribution of transposable elements (TEs) in a genome reflects a balance between insertion rate and selection against new insertions. Understanding the distribution of TEs therefore provides insights into the forces shaping the organization of genomes. Past research has shown that TEs tend to accumulate in genomic regions with low gene density and low recombination rate. However, little is known about the factors modulating insertion rates across the genome and their evolutionary significance. One candidate factor is gene expression, which has been suggested to increase the local insertion rate by rendering DNA more accessible. We test this hypothesis by comparing the TE density around germline- and soma-expressed genes in the euchromatin of Drosophila melanogaster. Because only insertions that occur in the germline are transmitted to the next generation, we predicted a higher density of TEs around germline-expressed genes than around soma-expressed genes. We show that the rate of TE insertions is greater near germline- than soma-expressed genes. However, this effect is partly offset by stronger selection for genome compactness (against excess noncoding DNA) on germline-expressed genes. We also demonstrate that the local genome organization in clusters of coexpressed genes plays a fundamental role in the genomic distribution of TEs. Our analysis shows that, in addition to recombination rate, the distribution of TEs is shaped by the interaction of gene expression and genome organization. The important role of selection for compactness sheds new light on the role of TEs in genome evolution. Instead of making genomes grow passively, TEs are controlled by the forces shaping genome compactness, most likely linked to the efficiency or complexity of gene expression, and possibly their interaction with mechanisms of TE silencing.
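The core comparison, TE density near germline- versus soma-expressed genes, reduces to counting insertions per base pair within a window around each gene set. The toy sketch below shows that bookkeeping; the coordinates, window size, and gene sets are made up for illustration and bear no relation to the Drosophila data analysed in the paper.

```python
def te_density(te_positions, genes, window):
    """TE insertions per base pair falling within `window` bp of any gene
    in `genes` (a list of (start, end) intervals, 0-based half-open).
    Overlapping windows are counted once via the `covered` position set."""
    covered = set()
    for start, end in genes:
        covered.update(range(max(0, start - window), end + window))
    hits = sum(1 for p in te_positions if p in covered)
    return hits / len(covered) if covered else 0.0

# Hypothetical coordinates: two 'germline-expressed' genes, one
# 'soma-expressed' gene, and six TE insertion sites.
germline_genes = [(1_000, 2_000), (5_000, 5_600)]
soma_genes = [(10_000, 11_000)]
tes = [900, 1_500, 2_300, 5_100, 9_000, 10_500]
d_germline = te_density(tes, germline_genes, window=500)
d_soma = te_density(tes, soma_genes, window=500)
```

In the actual analysis, such densities would be compared across genome-wide gene sets while controlling for recombination rate and gene clustering.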
Abstract:
Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle.
But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities. I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.
Abstract:
The purpose of this engineering thesis is to develop a space instrument that is part of BepiColombo, a joint mission of the European Space Agency (ESA) and the Japanese space agency JAXA. The satellite will be launched toward Mercury in 2013; the spacecraft's journey will take a total of six years, arriving in 2019. One of the BepiColombo satellite's scientific instruments is the SIXS instrument (Solar Intensity X-ray and particle Spectrometer), developed by Oxford Instruments Analytical Oy. Its purpose is to measure X-ray and particle radiation coming from the Sun. It operates together with the MIXS instrument (Mercury Imaging X-ray Spectrometer), which measures Mercury's surface; from the combined results, the elements composing Mercury's surface can be determined. The thesis begins by presenting the theoretical background of elemental measurement to the extent necessary for this work. It then examines in more detail the technology and mechanical design of the instrument responsible for measuring solar radiation, and covers the main aspects of the instrument's thermal design, vibration measurements, and strength analysis. As a result of the work, a prototype of the instrument's particle detector and a model of the instrument housing were developed. The final size of the instrument housing will be determined by the space required by the electronics. Development of the instrument continues at Oxford Instruments Analytical Oy until 2011.