751 results for Segmented polyurethanes
Abstract:
A neural model is proposed of how laminar interactions in the visual cortex may learn and recognize object texture and form boundaries. The model brings together five interacting processes: region-based texture classification, contour-based boundary grouping, surface filling-in, spatial attention, and object attention. The model shows how form boundaries can determine regions in which surface filling-in occurs; how surface filling-in interacts with spatial attention to generate a form-fitting distribution of spatial attention, or attentional shroud; how the strongest shroud can inhibit weaker shrouds; and how the winning shroud regulates learning of texture categories, and thus the allocation of object attention. The model can discriminate abutted textures with blurred boundaries and is sensitive to texture boundary attributes like discontinuities in orientation and texture flow curvature as well as to relative orientations of texture elements. The model quantitatively fits a large set of human psychophysical data on orientation-based textures. Object boundary output of the model is compared to computer vision algorithms using a set of human-segmented photographic images. The model classifies textures and suppresses noise using a multiple-scale oriented filterbank and a distributed Adaptive Resonance Theory (dART) classifier. The matched signal between the bottom-up texture inputs and top-down learned texture categories is utilized by oriented competitive and cooperative grouping processes to generate texture boundaries that control surface filling-in and spatial attention. Top-down modulatory attentional feedback from boundary and surface representations to early filtering stages results in enhanced texture boundaries and more efficient learning of texture within attended surface regions. Surface-based attention also provides a self-supervising training signal for learning new textures.
The importance of the surface-based attentional feedback in texture learning and classification is tested using a set of textured images from the Brodatz micro-texture album. Benchmark classification accuracies range from 95.1% to 98.6% with attention, and from 90.6% to 93.2% without attention.
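The oriented filtering stage described above can be sketched minimally with a small Gabor filterbank, assuming mean rectified response as the per-orientation feature; the dART classifier, grouping, and attentional stages are not modelled here, and all function names and parameter values are hypothetical:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def gabor_kernel(theta, freq=0.25, sigma=3.0, size=15):
    """Real-valued oriented Gabor kernel at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def orientation_energy(image, n_orientations=4):
    """Mean rectified filter response in each orientation channel."""
    feats = []
    for k in range(n_orientations):
        kern = gabor_kernel(k * np.pi / n_orientations)
        windows = sliding_window_view(image, kern.shape)
        resp = np.einsum('ijkl,kl->ij', windows, kern)
        feats.append(np.abs(resp).mean())
    return np.array(feats)

# Vertical stripes (intensity varies along x) excite the theta=0 channel;
# horizontal stripes excite the theta=pi/2 channel (index 2 of 4).
vert = np.tile(np.sin(2 * np.pi * 0.25 * np.arange(32)), (32, 1))
horiz = vert.T
```

In the model, responses like these at multiple scales would feed the distributed ART classifier; in this sketch, the channel with maximal energy simply identifies the dominant texture orientation.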
Abstract:
An improved Boundary Contour System (BCS) and Feature Contour System (FCS) neural network model of preattentive vision is applied to two large images containing range data gathered by a synthetic aperture radar (SAR) sensor. The goal of processing is to make structures such as motor vehicles, roads, or buildings more salient and more interpretable to human observers than they are in the original imagery. Early processing by shunting center-surround networks compresses signal dynamic range and performs local contrast enhancement. Subsequent processing by filters sensitive to oriented contrast, including short-range competition and long-range cooperation, segments the image into regions. Finally, a diffusive filling-in operation within the segmented regions produces coherent visible structures. The combination of BCS and FCS helps to locate and enhance structure over regions of many pixels, without the resulting blur characteristic of approaches based on low spatial frequency filtering alone.
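The early shunting center-surround stage admits a compact closed form at equilibrium. The sketch below is a simplified one-dimensional version with illustrative box-filter kernels rather than the BCS's actual receptive fields; it shows the two properties mentioned, dynamic-range compression (bounded output for arbitrarily scaled input) and local contrast enhancement at an edge:

```python
import numpy as np

def shunting_equilibrium(signal, A=1.0, B=1.0, D=1.0):
    """Equilibrium of a shunting on-center off-surround network:
    x = (B*C - D*S) / (A + C + S), with excitatory center input C and
    inhibitory surround input S. Box averages stand in for the actual
    Gaussian receptive fields; widths are illustrative."""
    def box_mean(x, width):
        return np.convolve(x, np.ones(width) / width, mode="same")
    C = box_mean(signal, 3)    # narrow center
    S = box_mean(signal, 15)   # broad surround
    return (B * C - D * S) / (A + C + S)

# A step edge at two very different overall intensities: the output stays
# bounded in (-D, B) whatever the input scale (dynamic-range compression),
# while the edge survives as a local response peak.
edge = np.concatenate([np.full(50, 1.0), np.full(50, 2.0)])
dim = shunting_equilibrium(edge)
bright = shunting_equilibrium(1000.0 * edge)
```

The shunting denominator is what compresses dynamic range: scaling the input by 1000 barely changes the equilibrium response, while uniform regions are driven toward zero and the edge remains salient.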
Abstract:
Introduction: Copayments for prescriptions are associated with decreased adherence to medicines, resulting in increased health service utilisation, morbidity and mortality. In October 2010 a 50c copayment per prescription item was introduced on the General Medical Services (GMS) scheme in Ireland, the national public health insurance programme for low-income and older people. The copayment was increased to €1.50 per prescription item in January 2013. To date, the impact of these copayments on adherence to prescription medicines on the GMS scheme has not been assessed. Given that the GMS population comprises more than 40% of the Irish population, this presents an important public health problem. The aim of this thesis was to assess the impact of two prescription copayments, 50c and €1.50, on adherence to medicines. Methods: In Chapter 2 the published literature was systematically reviewed with meta-analysis to a) develop evidence on cost-sharing for prescriptions and adherence to medicines and b) develop evidence for an alternative policy option, the removal of copayments. The core research question of this thesis was addressed by a large before-and-after longitudinal study, with comparator group, using the national pharmacy claims database. New users of essential and less-essential medicines were included in the study, with sample sizes ranging from 7,007 to 136,111 individuals in different medication groups. Segmented regression was used with generalised estimating equations to allow for correlations between repeated monthly measurements of adherence. A qualitative study involving 24 individuals was conducted to assess patient attitudes towards the 50c copayment policy. The qualitative and quantitative findings were integrated in the discussion chapter of the thesis.
The vast majority of the literature on this topic is generated in North America; therefore a test of generalisability was carried out in Chapter 5 by comparing the impact of two similar copayment interventions on adherence, one in the U.S. and one in Ireland. The method used to measure adherence in Chapters 3 and 5 was validated in Chapter 6. Results: The systematic review with meta-analysis demonstrated an 11% increase in the odds of non-adherence (OR 1.11; 95% CI 1.09 to 1.14) when publicly insured populations were exposed to copayments. The second systematic review found moderate but variable improvements in adherence after removal/reduction of copayments in a general population. The core paper of this thesis found that both the 50c and €1.50 copayments on the GMS scheme were associated with larger reductions in adherence to less-essential medicines than to essential medicines directly after the implementation of the policies. An important exception to this pattern was observed: adherence to anti-depressant medications declined to a larger extent than adherence to other essential medicines after both copayments. The cross-country comparison indicated that North American evidence on cost-sharing for prescriptions is not automatically generalisable to the Irish setting. Irish patients had greater immediate decreases of -5.3% (95% CI -6.9 to -3.7) and -2.8% (95% CI -4.9 to -0.7) in adherence to anti-hypertensive and anti-hyperlipidaemic medicines, respectively, directly after the policy changes, relative to their U.S. counterparts. In the long term, however, the U.S. and Irish populations behaved similarly. The concordance study highlighted the possibility of a measurement bias in the measurement of adherence to non-steroidal anti-inflammatory drugs in Chapter 3.
Conclusions: This thesis has presented two reviews of international cost-sharing policies, an assessment of the generalisability of international evidence, and both qualitative and quantitative examinations of cost-sharing policies for prescription medicines on the GMS scheme in Ireland. It was found that the introduction of a 50c copayment and its subsequent increase to €1.50 on the GMS scheme had a larger impact on adherence to less-essential medicines than to essential medicines, with the exception of anti-depressant medications. This is in line with policy objectives to reduce moral hazard and is therefore demonstrative of the value of such policies. There are, however, some caveats. The copayment now stands at €2.50 per prescription item. The impact of this increase has yet to be assessed, which is an obvious point for future research. Careful monitoring for adverse effects in socio-economically disadvantaged groups within the GMS population is also warranted. International evidence can be applied to the Irish setting to aid future decision making in this area, but not without first placing it in the local context. Patients accepted the introduction of the 50c charge, but did voice concerns over a rising price. The challenge for policymakers is to find the 'optimal copayment', whereby moral hazard is decreased but access to essential chronic disease medicines that provide advantages at the population level is not deterred. The evidence presented in this thesis will inform future policy-making in Ireland.
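The segmented regression design used in the core study can be illustrated with a simplified sketch. The thesis used generalised estimating equations to handle correlated monthly measurements; the version below is plain OLS on synthetic monthly adherence data with a hypothetical level drop at the month a copayment is introduced (all numbers are illustrative, not the study's results):

```python
import numpy as np

def interrupted_time_series(y, t, t0):
    """OLS fit of y = b0 + b1*t + b2*step + b3*(t - t0)*step, where
    step = 1 for t >= t0 (post-policy). Returns [baseline level,
    baseline trend, level change, trend change]."""
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic monthly adherence (%): slow decline, then a 5-point level drop
# when a copayment is introduced at month 24.
t = np.arange(48, dtype=float)
rng = np.random.default_rng(0)
y = 90.0 - 0.1 * t - 5.0 * (t >= 24) + rng.normal(0.0, 0.3, t.size)
beta = interrupted_time_series(y, t, 24)
```

The level-change coefficient (beta[2]) captures the immediate adherence drop at the policy date; in the thesis, a GEE working-correlation structure replaces the OLS independence assumption so that repeated measurements on the same individuals are handled correctly.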
Abstract:
We report a new inkless catalytic μCP technique that achieves accurate, fast, and complete pattern reproduction on SAMs of Boc- and TBS-protected thiols immobilized on gold, using a polyurethane-acrylate stamp functionalized with covalently bound sulfonic acids. Pattern transfer is complete at room temperature after just one minute of contact and yields sub-200 nm structures of chemically differentiated SAMs.
Abstract:
Commercially available implantable needle-type glucose sensors for diabetes management are robust analytically but can be unreliable clinically primarily due to tissue-sensor interactions. Here, we present the physical, drug release and bioactivity characterization of tubular, porous dexamethasone (Dex)-releasing polyurethane coatings designed to attenuate local inflammation at the tissue-sensor interface. Porous polyurethane coatings were produced by the salt-leaching/gas-foaming method. Scanning electron microscopy and micro-computed tomography (micro-CT) showed controlled porosity and coating thickness. In vitro drug release from coatings monitored over 2 weeks presented an initial fast release followed by a slower release. Total release from coatings was highly dependent on initial drug loading amount. Functional in vitro testing of glucose sensors deployed with porous coatings against glucose standards demonstrated that highly porous coatings minimally affected signal strength and response rate. Bioactivity of the released drug was determined by monitoring Dex-mediated, dose-dependent apoptosis of human peripheral blood derived monocytes in culture. Acute animal studies were used to determine the appropriate Dex payload for the implanted porous coatings. Pilot short-term animal studies showed that Dex released from porous coatings implanted in rat subcutis attenuated the initial inflammatory response to sensor implantation. These results suggest that deploying sensors with the porous, Dex-releasing coatings is a promising strategy to improve glucose sensor performance.
Abstract:
BACKGROUND: The bioluminescence technique was used to quantify the local glucose concentration in the tissue surrounding subcutaneously implanted polyurethane material and surrounding glucose sensors. In addition, some implants were coated with a single layer of adipose-derived stromal cells (ASCs), because these cells improve the wound-healing response around biomaterials. METHODS: Control and ASC-coated implants were implanted subcutaneously in rats for 1 or 8 weeks (polyurethane) or for 1 week only (glucose sensors). Tissue biopsies adjacent to the implant were immediately frozen at the time of explant. Cryosections were assayed for glucose concentration profile using the bioluminescence technique. RESULTS: For the polyurethane samples, no significant differences in glucose concentration within 100 μm of the implant surface were found between bare and ASC-coated implants at 1 or 8 weeks. A glucose concentration gradient was demonstrated around the glucose sensors. For all sensors, the minimum glucose concentration of approximately 4 mM was found at the implant surface and increased with distance from the sensor surface until the glucose concentration peaked at approximately 7 mM at 100 μm. The glucose concentration then decreased to 5.5-6.5 mM more than 100 μm from the surface. CONCLUSIONS: The ASC attachment to polyurethane and to glucose sensors did not change the glucose profiles in the tissue surrounding the implants. Although most glucose sensors incorporate a diffusion barrier to reduce the gradient of glucose and oxygen in the tissue, it is typically assumed that there is no steep glucose gradient around the sensors. However, a glucose gradient was observed around the sensors. A more complete understanding of glucose transport and concentration gradients around sensors is critical.
Abstract:
Although the release of nitric oxide (NO) from biomaterials has been shown to reduce the foreign body response (FBR), the optimal NO release kinetics and doses remain unknown. Herein, polyurethane-coated wire substrates with varying NO release properties were implanted into porcine subcutaneous tissue for 3, 7, 21 and 42 d. Histological analysis revealed that materials with short NO release durations (i.e., 24 h) were insufficient to reduce the collagen capsule thickness at 3 and 6 weeks, whereas implants with longer release durations (i.e., 3 and 14 d) and greater NO payloads significantly reduced the collagen encapsulation at both 3 and 6 weeks. The acute inflammatory response was mitigated most notably by systems with the longest duration and greatest dose of NO release, supporting the notion that these properties are most critical in circumventing the FBR for subcutaneous biomedical applications (e.g., glucose sensors).
Abstract:
Intraoperative assessment of surgical margins is critical to ensuring residual tumor does not remain in a patient. Previously, we developed a fluorescence structured illumination microscope (SIM) system with a single-shot field of view (FOV) of 2.1 × 1.6 mm (3.4 mm2) and sub-cellular resolution (4.4 μm). The goal of this study was to test the utility of this technology for the detection of residual disease in a genetically engineered mouse model of sarcoma. Primary soft tissue sarcomas were generated in the hindlimb, and after the tumor was surgically removed, the relevant margin was stained with acridine orange (AO), a vital stain that brightly stains cell nuclei and fibrous tissues. The tissues were imaged with the SIM system with the primary goal of visualizing fluorescent features from tumor nuclei. Given the heterogeneity of the background tissue (presence of adipose tissue and muscle), an algorithm known as maximally stable extremal regions (MSER) was optimized and applied to the images to specifically segment nuclear features. A logistic regression model was used to classify a tissue site as positive or negative from the area fraction and shape of the segmented features, and the resulting receiver operating characteristic (ROC) curve was generated by varying the probability threshold. Based on the ROC curves, the model was able to classify tumor and normal tissue with 77% sensitivity and 81% specificity (Youden's index). For an unbiased measure of model performance, the model was applied to a separate validation dataset, yielding 73% sensitivity and 80% specificity. When this approach was applied to representative whole margins, for a tumor probability threshold of 50%, only 1.2% of all regions from the negative margin exceeded this threshold, while over 14.8% of all regions from the positive margin exceeded it.
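The classification step can be sketched as follows, assuming per-site features (nuclear area fraction and a shape measure) have already been extracted by MSER segmentation; the feature values, class separations, and training procedure below are synthetic illustrations, not the study's data:

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, n_iter=3000):
    """Logistic regression fit by batch gradient descent (bias included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def sens_spec(scores, y, threshold):
    """Sensitivity and specificity at one probability threshold; sweeping
    the threshold traces out the ROC curve."""
    pred = scores >= threshold
    sens = (pred & (y == 1)).sum() / (y == 1).sum()
    spec = (~pred & (y == 0)).sum() / (y == 0).sum()
    return sens, spec

# Synthetic per-site features [nuclear area fraction, mean eccentricity],
# with tumour sites given a higher area fraction on average.
rng = np.random.default_rng(1)
tumour = np.column_stack([rng.normal(0.30, 0.05, 200), rng.normal(0.6, 0.1, 200)])
normal = np.column_stack([rng.normal(0.15, 0.05, 200), rng.normal(0.5, 0.1, 200)])
X = np.vstack([tumour, normal])
y = np.concatenate([np.ones(200), np.zeros(200)])
w = fit_logistic(X, y)
scores = 1.0 / (1.0 + np.exp(-np.column_stack([np.ones(len(X)), X]) @ w))
sens, spec = sens_spec(scores, y, 0.5)
```

Evaluating sens_spec over a grid of thresholds yields the ROC curve from which an operating point (e.g., Youden's index) can be chosen, mirroring the thresholding procedure in the abstract.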
Abstract:
Electromagnetic processing of liquid metals involves a dynamically changing fluid volume interfacing with a melting solid material, a gas or vacuum, and possibly a different liquid. The electromagnetic field and the associated force field are strongly coupled to the free surface dynamics and to the heat-mass transfer. We present practical modelling examples of the flow and heat transfer using an accurate pseudo-spectral code and the k-omega turbulence model suitable for complex and transitional flows with free surfaces. The 'cold crucible' melting process is modelled dynamically, including the gradual propagation of the melting front and the magnetically confined free surface. Intermittent contact with the water-cooled segmented wall and the radiation heat losses are part of the complex problem.
Abstract:
Induction heating is an efficient method used to melt electrically conductive materials, particularly if melting takes place in a ceramic crucible. This form of melting is particularly good for alloys, as electromagnetic forces set up by the induction coil lead to vigorous stirring of the melt, ensuring homogeneity and uniformity in temperature. However, for certain reactive alloys, or where high purity is required, ceramic crucibles cannot be used, and a water-cooled segmented copper crucible is employed instead. Water cooling prevents meltdown or distortion of the metal wall, but much of the energy goes into the coolant. To reduce this loss, the electromagnetic force generated by the coil is used to push the melt away from the walls and so minimise contact with water-cooled surfaces. Even then, heat is lost through the crucible base, where contact is inevitable. In a collaborative programme between Greenwich and Birmingham Universities, computer modelling has been used in conjunction with experiments to improve the superheat attainable in the melt for a number of alloys, especially γ-TiAl intermetallics for casting aeroengine turbine blades. The model solves the discretised form of the turbulent Navier-Stokes, thermal energy conservation and Maxwell equations using a spectral collocation technique. The time-varying melt envelope is followed explicitly during the computation using an adaptive mesh. This paper briefly describes the mathematical model used to represent the interaction between the magnetic field, fluid flow, heat transfer and change of phase in the crucible, and identifies the proportions of energy used in the melt, lost in the crucible base, and lost in the crucible walls. The role of turbulence is highlighted as important in controlling heat losses, and turbulence damping is introduced as a means of improving superheat.
The model is validated against experimental results and shows good agreement with measured temperatures and energy losses in the cooling fluid throughout the melting cycle.
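A quantity central to such induction-melting models is the electromagnetic skin depth, which sets how far the coil's field penetrates the melt and hence where the confining Lorentz force acts. A minimal sketch, using an illustrative resistivity value rather than measured TiAl data:

```python
import numpy as np

def skin_depth(resistivity, frequency, mu_r=1.0):
    """Electromagnetic skin depth, delta = sqrt(2*rho / (mu0*mu_r*omega)),
    for a conductor with resistivity rho (ohm*m) in a field of angular
    frequency omega = 2*pi*f."""
    mu0 = 4.0e-7 * np.pi
    omega = 2.0 * np.pi * frequency
    return np.sqrt(2.0 * resistivity / (mu_r * mu0 * omega))

# Illustrative resistivity for a liquid titanium aluminide, ~1.6e-6 ohm*m,
# with a 10 kHz coil: the field penetrates only a few millimetres, so the
# Lorentz force (and much of the Joule heating) is concentrated in a thin
# surface layer of the melt.
delta = skin_depth(1.6e-6, 10e3)
```

Raising the coil frequency shrinks the skin depth, which is one reason the field can confine the melt surface while leaving the bulk flow to be driven by the induced stirring.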
Abstract:
Purpose – A small cold crucible offers possibilities for melting various electrically conducting materials with minimal wall contact. Such small samples can be used for express contamination analysis, preparing limited amounts of reactive alloys, or experimental material analyses. The paper aims to present a model to follow the melting process. Design/methodology/approach – A numerical model is presented in which different types of axisymmetric coil configurations are analysed. Findings – The numerical model permits the melting process to be followed dynamically: the change in the high-frequency magnetic field distribution, the evolution of the free surface and the melting front, and the associated turbulent fluid dynamics. The partially solidified skin in contact with the cold crucible walls and bottom is dynamically predicted. The segmented crucible shape can be cylindrical, hemispherical or arbitrarily shaped. Originality/value – The model presented within the paper permits the analysis of melting times, melt shapes, electrical efficiency and particle tracks.
Abstract:
A practical analytical workshop was held at NIOZ (Royal Netherlands Institute for Sea Research), The Netherlands, on 12-15 November 2012. The aim of the workshop was to gain information from the global nutrient analytical community about general problems which arise when measuring nutrients, and then to attempt to investigate these problems in the laboratory, with a small, select, representative group of international nutrient analysts conducting the lab work. Eighteen experts participated and worked simultaneously on four different PO4 gas-segmented CFA systems. This report documents the findings of the workshop and describes recommendations based on group consensus which can hopefully assist the larger community of labs worldwide participating in the Inter-Laboratory Comparison RMNS 2012 studies organized by MRI in Japan.
Abstract:
Particles of most virus species accurately package a single genome, but there are indications that the pleomorphic particles of parainfluenza viruses incorporate multiple genomes. We characterized a stable measles virus mutant that efficiently packages at least two genomes. The first genome is recombinant and codes for a defective attachment protein with an appended domain interfering with fusion-support function. The second has one adenosine insertion in a purine run that interrupts translation of the appended domain and restores function. In that genome, a one base deletion in a different purine run abolishes polymerase synthesis, but restores hexameric genome length, thus ensuring accurate RNA encapsidation, which is necessary for efficient replication. Thus, the two genomes are complementary. The infection kinetics of this mutant indicate that packaging of multiple genomes does not negatively affect growth. We also show that polyploid particles are produced in standard infections at no expense to infectivity. Our results illustrate how the particles of parainfluenza viruses efficiently accommodate cargoes of different volume, and suggest a mechanism by which segmented genomes may have evolved.
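The length bookkeeping behind "restores hexameric genome length" follows the paramyxovirus rule of six: each nucleoprotein subunit encapsidates exactly six nucleotides, so only genomes whose length is a multiple of six replicate efficiently. A minimal sketch (the appended-domain length is a hypothetical figure):

```python
def obeys_rule_of_six(length_nt):
    """True when a genome length is a multiple of six nucleotides, the
    condition for efficient paramyxovirus replication (each nucleoprotein
    subunit encapsidates exactly six bases)."""
    return length_nt % 6 == 0

wild_type = 15894   # measles virus genome length (nt), a multiple of 6
appended = 3000     # hypothetical appended-domain length, also hexameric
# The adenosine insertion (+1) in one purine run and the single-base
# deletion (-1) in another cancel out, restoring hexameric length.
recombinant = wild_type + appended + 1 - 1
```

The point of the arithmetic is that the two single-base changes described in the abstract are complementary: either alone breaks the rule of six, but together they leave the net genome length hexameric.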
Abstract:
Our understanding of how the visual system processes motion transparency, the phenomenon by which multiple directions of motion are perceived to co-exist in the same spatial region, has grown considerably in the past decade. There is compelling evidence that the process is driven by global-motion mechanisms. Consequently, although transparently moving surfaces are readily segmented over an extended space, the visual system cannot separate two motion signals that co-exist in the same local region. A related issue is whether the visual system can detect transparently moving surfaces simultaneously, or whether the component signals encounter a serial 'bottleneck' during their processing. Our initial results show that, at sufficiently short stimulus durations, observers cannot accurately detect two superimposed directions; yet they have no difficulty in detecting one pattern direction in noise, supporting the serial-bottleneck scenario. However, in a second experiment, the difference in performance between the two tasks disappears when the component patterns are segregated. This discrepancy between the processing of transparent and non-overlapping patterns may be a consequence of suppressed activity of global-motion mechanisms when the transparent surfaces are presented in the same depth plane. To test this explanation, we repeated our initial experiment while separating the motion components in depth. The marked improvement in performance leads us to conclude that transparent motion signals are represented simultaneously.
Abstract:
The debate over the possible extension of transparency regulation in Europe to include sovereign bonds has opened up a number of other issues in need of serious consideration. One such issue is the appropriateness of the entire infrastructure supporting the trading of European sovereign bonds. In recent years sovereign issuers have supported the development of an electronic inter-dealer market but have remained unconcerned with the opacity of dealer-to-customer trading. The degree of segmentation in this market is high relative to what exists in nearly all other financial markets. This paper explores why European sovereign bond markets have developed in such a segmented way and considers how this structure could be altered to improve transparency without adversely affecting liquidity, efficiency or the benefits enjoyed by primary dealers and issuers. It is suggested that the structure of the market could be improved greatly if the largest and most active investors were permitted access to the inter-dealer electronic trading platforms. This would solve a number of market imperfections and increase the proportion of market activity that is conducted in a transparent way. The paper argues that sovereign issuers in Europe have the means to provide incentives that would influence dealers to support reduced segmentation. Some practical examples of how this could be achieved are provided and the potential benefits are outlined.